[5/N][Attention] Finish eliminating vllm/attention folder #32064
Merged
ProExpertProg merged 26 commits into vllm-project:main on Jan 27, 2026
Conversation
Contributor
Code Review
This pull request completes the refactoring to eliminate the vllm/attention directory. The changes mostly involve moving files and splitting vllm/attention/layer.py. While the file moves are correct, a critical import path was missed during the refactoring, which will cause an ImportError. I've provided a fix for this.
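The thread does not quote the file that was missed, so the snippet below is only an illustrative sketch of the class of fix the reviewer describes: a stale import of the removed package has to be repointed at the new location named in this PR's description.

```python
# Illustrative sketch only; the actual module the reviewer fixed is not
# named in this thread.

# Before: this PR deletes the vllm/attention package, so this raises
# ImportError at import time.
# from vllm.attention import Attention

# After: the layer now lives under vllm/model_executor/layers/attention,
# whose __init__.py re-exports it at module level (see Purpose below).
from vllm.model_executor.layers.attention import Attention
```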
Force-pushed from a873e8f to 1cb4ce3
Documentation preview: https://vllm--32064.org.readthedocs.build/en/32064/
This pull request has merge conflicts that must be resolved before it can be merged.
Collaborator
Author
Holding off to let #25954 land first
ProExpertProg approved these changes on Jan 27, 2026
VedantMadane pushed a commit to VedantMadane/vllm that referenced this pull request on Jan 28, 2026:
…ject#32064) Signed-off-by: Matthew Bonanni <mbonanni@redhat.com> Signed-off-by: Vedant Madane <6527493+VedantMadane@users.noreply.github.com>
vkuzo added a commit to vkuzo/vllm that referenced this pull request on Jan 30, 2026:
Summary: vllm-project#32133 missed a rebase on vllm-project#32064; this fixes the attention path import.

Test Plan:
```bash
# Before this PR, the test runner failed because the old attention
# import path no longer exists.
pytest tests/quantization/test_fp8.py -s -x
```

Signed-off-by: vasiliy <vasiliy@fb.com>
apd10 pushed a commit to apd10/vllm that referenced this pull request on Jan 31, 2026:
…ject#32064) Signed-off-by: Matthew Bonanni <mbonanni@redhat.com>
Merge #32060 before this.
Purpose
Step 5 of #31919: This PR finishes eliminating the `vllm/attention` folder by doing the following (a sketch of the resulting package follows this list):

- Splits `vllm/attention/layer.py` into `vllm/model_executor/layers/attention/mla_attention.py` (`MLAAttention`, `unified_mla_attention`) and `vllm/model_executor/layers/attention/attention.py` (`Attention`, `unified_attention`)
- Moves the `vllm/attention/utils/kv_sharing_utils.py` content into `vllm/model_executor/layers/attention/attention.py`
- Moves `vllm/attention/utils/kv_transfer_utils.py` to `vllm/model_executor/layers/attention/kv_transfer_utils.py`
- Removes the `vllm/attention` folder
- Adds `vllm/model_executor/layers/attention/__init__.py` to enable module-level imports
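For concreteness, here is a minimal sketch of what the module-level re-exports in the new `__init__.py` could look like. The real file's contents are not quoted in this thread; the module and symbol names are taken from the list above.

```python
# vllm/model_executor/layers/attention/__init__.py (illustrative sketch only)
# Re-export the split-out layers so callers can import them from the package
# root rather than from the individual submodules.
from vllm.model_executor.layers.attention.attention import (
    Attention,
    unified_attention,
)
from vllm.model_executor.layers.attention.mla_attention import (
    MLAAttention,
    unified_mla_attention,
)

__all__ = [
    "Attention",
    "unified_attention",
    "MLAAttention",
    "unified_mla_attention",
]
```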
Test Plan
CI (should run all tests)
Test Result
Note
Completes migration away from `vllm/attention` to `vllm/model_executor`.

- Splits `vllm/attention/layer.py` into `.../attention/attention.py` (`Attention`, `unified_attention`) and `.../attention/mla_attention.py` (`MLAAttention`, `unified_mla_attention`); moves MLA custom ops
- Inlines `validate_kv_sharing_target` and moves `kv_transfer_utils` into `.../layers/attention`; deletes `vllm/attention` and `vllm/attention/utils/kv_sharing_utils.py`
- Updates imports across models, backends, workers, quantization, tests, and docs (`TYPE_CHECKING`, annotations) to `.../layers/attention`; the mypy config stops listing the removed package

Written by Cursor Bugbot for commit 8b56809.