[ROCm] Add Aiter PagedAttention with Sliding Window support#28719
Closed
sammysun0711 wants to merge 9 commits into vllm-project:main from
Conversation
Signed-off-by: Xiake Sun <xiake.sun@amd.com>
…mtp parameter Signed-off-by: Xiake Sun <xiake.sun@amd.com>
Contributor
Author
Sorry, I need to close this due to a rebase issue; continuing in new PR: #29065.
Purpose
This PR aims to add Aiter PagedAttention (PA) with sliding window support, which fixes a bfloat16 accuracy issue of google/gemma-3-27b-it with Aiter PA. google/gemma-3-27b-it is trained with the additional parameter "sliding_window": 1024.
Triton unified attention (default)
unified_attention passes sliding_window as input: vllm/vllm/v1/attention/backends/triton_attn.py
Lines 346 to 366 in 30700b1
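As a refresher on what the backend is enforcing here (this is an illustrative sketch, not vLLM's kernel code; all names are made up): a sliding window of size W restricts the causal mask so that query position i may only attend key positions j with i − W < j ≤ i.

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean [seq_len, seq_len] mask; True means the query may attend that key."""
    i = np.arange(seq_len)[:, None]  # query positions
    j = np.arange(seq_len)[None, :]  # key positions
    # causal: no attending to future keys; window: only the last `window` keys
    return (j <= i) & (j > i - window)

mask = sliding_window_mask(6, window=3)
print(mask[5])  # query 5 attends keys 3, 4, 5 only
```

A backend that drops this constraint still produces valid-looking outputs, but every token silently attends the full causal prefix, which diverges from how the model was trained.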
Aiter PA
torch.ops.aiter.paged_attention_v1 does not take sliding_window as input: vllm/vllm/v1/attention/backends/rocm_aiter_fa.py
Lines 810 to 828 in 30700b1
If the input prompt is longer than 1024 tokens, the missing sliding_window handling causes the gemma3 accuracy degradation with Aiter PA. To fix the gemma3 accuracy issue, the following 3 PRs are required:
Add the sliding_window parameter to torch.ops.aiter.paged_attention_v1. Opened as a draft PR for now since it depends on the other 2 PRs.
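To see why the window matters for paged attention specifically, consider a hypothetical sketch (illustrative only, not the aiter kernel's API): with window W and KV-cache block size B, a decode step only needs keys from the tail of the block table, and any key older than W positions must be excluded from the softmax.

```python
def blocks_in_window(context_len: int, window: int, block_size: int) -> list[int]:
    """Indices of KV-cache blocks that can hold in-window keys for the next token."""
    first_key = max(0, context_len - window)   # oldest key position still attendable
    first_block = first_key // block_size
    last_block = (context_len - 1) // block_size
    return list(range(first_block, last_block + 1))

# 2048-token context, gemma3's window of 1024, 16-token KV blocks:
# a window-aware kernel reads blocks 64..127 instead of all 128.
print(len(blocks_in_window(2048, window=1024, block_size=16)))  # 64
```

A kernel that never receives sliding_window cannot make this cut: it includes all cached keys in attention, which is exactly the mismatch with the model's training described above.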
Test Plan
lm_eval test with the gsm8k dataset
Test Result
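The exact lm_eval command was not spelled out above; a typical invocation of lm-evaluation-harness against a vLLM backend might look like the following (the model args and few-shot settings are assumptions, not the author's exact setup):

```shell
# Hypothetical reproduction command for the gsm8k accuracy check.
lm_eval --model vllm \
  --model_args pretrained=google/gemma-3-27b-it,dtype=bfloat16 \
  --tasks gsm8k \
  --num_fewshot 5 \
  --batch_size auto
```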
Essential Elements of an Effective PR Description Checklist
supported_models.md and examples for a new model.