[BUGFIX] fix Attention import path #33436

Closed

xuechendi wants to merge 1 commit into vllm-project:main from xuechendi:fix_after_32133

Conversation

@xuechendi
Contributor

@xuechendi xuechendi commented Jan 30, 2026

Purpose

The Attention/MLAAttention import path used in #32133 is out of date.

The bug is triggered by the following test:

pytest -v -s v1/structured_output/test_backend_guidance.py

Error log:

File "/usr/local/lib/python3.12/dist-packages/vllm/model_executor/model_loader/utils.py", line 21, in <module>
ERROR 01-30 16:28:14 [registry.py:798]     from vllm.model_executor.model_loader.reload import (
ERROR 01-30 16:28:14 [registry.py:798]   File "/usr/local/lib/python3.12/dist-packages/vllm/model_executor/model_loader/reload/__init__.py", line 29, in <module>
ERROR 01-30 16:28:14 [registry.py:798]     from .layerwise import (
ERROR 01-30 16:28:14 [registry.py:798]   File "/usr/local/lib/python3.12/dist-packages/vllm/model_executor/model_loader/reload/layerwise.py", line 10, in <module>
ERROR 01-30 16:28:14 [registry.py:798]     from vllm.attention.layer import Attention, MLAAttention
ERROR 01-30 16:28:14 [registry.py:798] ModuleNotFoundError: No module named 'vllm.attention'

This PR fixes this issue.
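The actual fix simply updates the import to the module's current location. As background, when a symbol moves between module paths across library versions, a fallback-import pattern can keep downstream code working against both; this is a generic sketch, and the helper `import_first_available` plus the candidate module paths are illustrative assumptions, not the code in this PR:

```python
import importlib


def import_first_available(candidates, attr):
    """Return `attr` from the first module in `candidates` that imports
    and exposes it; raise ModuleNotFoundError if none do."""
    for path in candidates:
        try:
            module = importlib.import_module(path)
        except ModuleNotFoundError:
            # Module path no longer exists (e.g. the old location); try next.
            continue
        if hasattr(module, attr):
            return getattr(module, attr)
    raise ModuleNotFoundError(f"{attr!r} not found in any of {candidates}")


# Illustrative stand-ins using stdlib modules; in vLLM the candidates
# would be the old and new locations of Attention / MLAAttention.
OrderedDict = import_first_available(
    ["no_such_module", "collections"], "OrderedDict"
)
```

A fixed import (as in this PR) is the cleaner long-term solution; the fallback pattern is mainly useful for code that must support multiple library versions at once.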

Test Plan

pytest -v -s v1/structured_output/test_backend_guidance.py

Test Result


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

Signed-off-by: Chendi Xue <chendi.xue@intel.com>
@xuechendi xuechendi requested a review from 22quinn as a code owner January 30, 2026 17:21
@mergify mergify bot added the bug Something isn't working label Jan 30, 2026
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request correctly fixes a ModuleNotFoundError by updating the import path for Attention and MLAAttention. The change is straightforward and also improves import ordering. I have no further suggestions.

@mgoin
Member

mgoin commented Jan 30, 2026

Thanks for the help @xuechendi! I will close this to prioritize #33432 since it was posted first.

@mgoin mgoin closed this Jan 30, 2026

Labels

bug Something isn't working


2 participants