
Upgrade vllm commit hash to 1216#5053

Merged
wangxiyuan merged 15 commits into vllm-project:main from Toneymiller:fix_1216
Dec 17, 2025
Conversation

@Toneymiller
Contributor

@Toneymiller Toneymiller commented Dec 16, 2025

What this PR does / why we need it?

Upstream vLLM PRs vllm-project/vllm#30212 and vllm-project/vllm#29646
refactored the attention backend selection interface. This PR adapts vllm-ascend's get_attn_backend_cls to align with the new upstream standard, ensuring compatibility and reducing maintenance overhead.

Does this PR introduce any user-facing change?

How was this patch tested?

Co-authored-by: leo-pony <nengjunma@outlook.com>

Contributor

@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request updates the vLLM commit hash in the versioning_policy.md documentation file. I have added one suggestion to format the commit hash as a clickable link to improve the documentation's usability and help prevent potential issues for users. Additionally, for future pull requests, please provide a descriptive title and fill out the description template to give reviewers better context, as the current title is unclear and the description is empty.

| vLLM Ascend | vLLM | Python | Stable CANN | PyTorch/torch_npu |
|-------------|--------------|------------------|-------------|--------------------|
| main | 4429d934de3c5cc327b0d7aec8e473aeba38db90, v0.12.0 tag | >= 3.10, < 3.12 | 8.3.RC2 | 2.8.0 / 2.8.0 |
| main | 6063853ead105f89400770b4fe6760959796d827, v0.12.0 tag | >= 3.10, < 3.12 | 8.3.RC2 | 2.8.0 / 2.8.0 |
Contributor


Severity: high

Hard-coding a full 40-character commit hash is error-prone. A typo could lead users to an incorrect commit, causing significant issues when they try to set up their environment based on this documentation. To improve usability and reduce this risk, I suggest making this hash a clickable link to the commit on GitHub. This allows for easy verification.

Suggested change
| main | 6063853ead105f89400770b4fe6760959796d827, v0.12.0 tag | >= 3.10, < 3.12 | 8.3.RC2 | 2.8.0 / 2.8.0 |
| main | [6063853e](https://github.com/vllm-project/vllm/commit/6063853ead105f89400770b4fe6760959796d827), v0.12.0 tag | >= 3.10, < 3.12 | 8.3.RC2 | 2.8.0 / 2.8.0 |

@github-actions github-actions Bot added the documentation (Improvements or additions to documentation) and ci/build labels Dec 16, 2025
@github-actions
Contributor

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message by filling out the PR description to help reviewers and future developers understand.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

Comment thread vllm_ascend/platform.py Outdated
        use_sparse = kwargs["attn_selector_config"].use_sparse
    else:
        use_mla = kwargs.get("use_mla", args[4] if len(args) >= 5 else None)
        use_sparse = kwargs.get("use_sparse", args[5] if len(args) >= 6 else None)
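The compatibility shim discussed in this thread can be sketched as a small helper that accepts both calling conventions. This is an illustrative reconstruction, not vllm-ascend's actual code: the AttnSelectorConfig stand-in class, the helper name extract_attn_flags, and the positional indices (use_mla at 0-based index 4, use_sparse at index 5) are assumptions based only on the snippet above.

```python
from dataclasses import dataclass


# Hypothetical stand-in for the upstream selector-config object;
# the field names here are assumptions for illustration.
@dataclass
class AttnSelectorConfig:
    use_mla: bool
    use_sparse: bool


def extract_attn_flags(*args, **kwargs):
    """Resolve (use_mla, use_sparse) from either calling convention.

    New-style callers pass a single attn_selector_config keyword argument;
    old-style callers pass the flags positionally or as keywords.
    """
    if "attn_selector_config" in kwargs:
        cfg = kwargs["attn_selector_config"]
        return cfg.use_mla, cfg.use_sparse
    # Fall back to the legacy signature: use_mla at positional index 4,
    # use_sparse at positional index 5 (0-based), or their keyword forms.
    use_mla = kwargs.get("use_mla", args[4] if len(args) >= 5 else None)
    use_sparse = kwargs.get("use_sparse", args[5] if len(args) >= 6 else None)
    return use_mla, use_sparse
```

With this shape, a new-style call such as extract_attn_flags(attn_selector_config=cfg) and a legacy positional call resolve to the same pair of flags, which is the compatibility the PR is after.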
Collaborator


use_sparse should be at index 6

@leo-pony
Collaborator

What this PR does / why we need it?

Does this PR introduce any user-facing change?

How was this patch tested?

Please paste the vLLM PR that breaks vllm-ascend in the description.

@wangxiyuan wangxiyuan added the ready (read for review) and ready-for-test (start test by label for PR) labels Dec 16, 2025
@wangxiyuan
Collaborator

please fix the CI error

Toneymiller and others added 12 commits December 16, 2025 11:31
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
@wangxiyuan
Collaborator

I'll fix the UT later

@wangxiyuan wangxiyuan merged commit b1a853b into vllm-project:main Dec 17, 2025
23 of 25 checks passed
wangxiyuan added a commit that referenced this pull request Dec 17, 2025
Fix broken ut introduced by #5053 

- vLLM version: v0.12.0
- vLLM main:
vllm-project/vllm@ad32e3e

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
chenaoxuan pushed a commit to chenaoxuan/vllm-ascend that referenced this pull request Dec 20, 2025
### What this PR does / why we need it?
Upstream vLLM PR vllm-project/vllm#30212
refactored the attention backend selection interface. This PR adapts
vllm-ascend's get_attn_backend_cls to align with the new upstream
standard, ensuring compatibility and reducing maintenance overhead.
### Does this PR introduce _any_ user-facing change?

### How was this patch tested?

Co-authored-by: leo-pony <nengjunma@outlook.com>
- vLLM version: v0.12.0
- vLLM main:
vllm-project/vllm@ad32e3e

---------

Signed-off-by: zxwang <1476209578@qq.com>
Signed-off-by: leo-pony <nengjunma@outlook.com>
Co-authored-by: leo-pony <nengjunma@outlook.com>
chenaoxuan pushed a commit to chenaoxuan/vllm-ascend that referenced this pull request Dec 20, 2025
@Yikun
Member

Yikun commented Dec 28, 2025

Thanks for your first contribution! Your awesome first PR has been included in the vLLM Ascend v0.13.0rc1 release.

[1] https://github.com/vllm-project/vllm-ascend/releases/tag/v0.13.0rc1
[2] https://mp.weixin.qq.com/s/3Psz3mYFTLktgSEDGqM9wQ

ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Feb 28, 2026
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Feb 28, 2026
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Mar 4, 2026
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Mar 4, 2026

Labels

ci/build; documentation (Improvements or additions to documentation); ready (read for review); ready-for-test (start test by label for PR); vllm-break


4 participants