
[Doc] backport 0.13.0 release note #6584

Merged
wangxiyuan merged 1 commit into vllm-project:main from wangxiyuan:backport_0130 on Feb 6, 2026

Conversation

wangxiyuan (Collaborator) commented on Feb 6, 2026

What this PR does / why we need it?

Backport the 0.13.0 release notes to the main branch and update the related doc links.

Does this PR introduce any user-facing change?

yes

How was this patch tested?

by doc CI

github-actions bot added the documentation label (Improvements or additions to documentation) on Feb 6, 2026
github-actions bot (Contributor) commented on Feb 6, 2026

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write a clear commit message and fill in the PR description to help reviewers and future developers understand the change.

If CI fails, you can run the linting and testing checks locally according to the Contributing and Testing guides.

gemini-code-assist bot (Contributor) commented

Summary of Changes

Hello @wangxiyuan, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request focuses on updating the project's documentation to officially announce and detail the vLLM Ascend v0.13.0 stable release. It ensures that all user-facing information, from the main project overview to specific versioning policies and FAQs, accurately reflects the latest stable version. The core impact is providing users with a complete and up-to-date resource for understanding the significant advancements and changes introduced in v0.13.0.

Highlights

  • Official v0.13.0 Release Integration: The pull request formally integrates the release notes for vLLM Ascend version 0.13.0, marking its transition from a release candidate to a stable version across all relevant documentation.
  • Comprehensive Documentation Updates: Various documentation files, including README.md, README.zh.md, docs/source/_templates/sections/header.html, docs/source/community/versioning_policy.md, and docs/source/faqs.md, have been updated to reflect the new v0.13.0 stable release and its associated links and compatibility information.
  • Detailed v0.13.0 Release Notes Added: A comprehensive section detailing the v0.13.0 release notes has been added to docs/source/user_guide/release_notes.md, covering extensive updates in model support, core features, hardware/operator support, performance, dependencies, deprecations, and known issues.


Changelog
  • README.md
    • Added a new 'Latest News' entry for the v0.13.0 release, dated 2026/02.
    • Updated the version table to designate v0.13.0 as the 'Latest stable version', replacing v0.13.0rc2 and removing v0.11.0.
  • README.zh.md
    • Added a new '最新消息' (Latest News) entry for the v0.13.0 release, dated 2026/02.
    • Updated the version table to designate v0.13.0 as the '最新正式/稳定版本' (Latest official/stable version), replacing v0.13.0rc2 and removing v0.11.0.
  • docs/source/_templates/sections/header.html
    • Updated the notification bar link to point to the documentation for the v0.13.0 stable release.
  • docs/source/community/versioning_policy.md
    • Added a new entry for v0.13.0 in the release compatibility matrix, specifying its vLLM, Python, CANN, PyTorch/torch_npu, and Triton Ascend versions.
    • Adjusted the vLLM version for v0.14.0rc1 from 'v0.14.0/v0.14.1' to 'v0.14.1'.
    • Added a new release event entry for the 'v0.13.0 Final release' on 2026.02.06.
  • docs/source/faqs.md
    • Updated the FAQ link for v0.14.0rc1.
    • Replaced the FAQ link for v0.13.0rc2 with a new link for the official v0.13.0 release.
    • Removed the FAQ link for v0.11.0.
  • docs/source/user_guide/release_notes.md
    • Inserted a comprehensive new section at the beginning of the file detailing the release notes for v0.13.0, covering model support, core features, hardware and operator support, performance improvements, dependency upgrades, deprecation and breaking changes, documentation updates, and known issues.
Activity
  • The pull request title indicates a documentation update to backport the 0.13.0 release note.
  • The pull request body is empty, suggesting no additional context or discussion from the author beyond the title.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature               Command               Description
Code Review           /gemini review        Performs a code review for the current pull request in its current state.
Pull Request Summary  /gemini summary       Provides a summary of the current pull request in its current state.
Comment               @gemini-code-assist   Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help                  /gemini help          Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request backports the release notes and documentation updates for version v0.13.0. The changes include updating the README files, the documentation header, versioning policy, FAQs, and adding the detailed release notes for v0.13.0. The changes are mostly correct, but I've pointed out a couple of links in the release notes that should be updated to point to the version-specific documentation instead of the 'latest' version to avoid future inconsistencies.

Two comment threads on docs/source/user_guide/release_notes.md (marked outdated)
wangxiyuan force-pushed the backport_0130 branch 2 times, most recently from 9487cd4 to 6b47df8, on February 6, 2026 02:01
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
wangxiyuan merged commit c38166e into vllm-project:main on Feb 6, 2026
10 checks passed
845473182 pushed a commit to 845473182/vllm-ascend that referenced this pull request Feb 6, 2026
…to qwen3next_rebase

* 'main' of https://github.com/vllm-project/vllm-ascend: (59 commits)
  [Feat.]: 310p support MOE models (vllm-project#6530)
  [Doc] backport 0.13.0 release note (vllm-project#6584)
  [CI] Update UT CANN version to 8.5.0 for main branch (vllm-project#6564)
  [CI] Change A2 runner (vllm-project#6557)
  [Bugfix] Fix the incorrect use of the output parameter in _forward_fia_slidingwindow (vllm-project#6469)
  [main2main] upgrade vllm main 0202 (vllm-project#6560)
  [CI][npugraph_ex]Fix npugraph ex e2e test (vllm-project#6553)
  [Feature]KV pool supports sparse attention (vllm-project#6339)
  [bugfix]Fix accuracy issue in PCP/DCP with speculative decoding (vllm-project#6491)
  perf: adaptive block size selection in linear_persistent kernel (vllm-project#6537)
  [ModelRunner][Fix] Pads query_start_loc to satisfy FIA/TND constraint (vllm-project#6475)
  [Bugfix]Fix of Pooling Code and Update of Pooling Usage Guide (vllm-project#6126)
  [Fusion] Add rmsnorm dynamic quant fusion pass (vllm-project#6274)
  [Bugfix] Synchronize only the current stream to avoid device sync (vllm-project#6432)
  [CI] Add long and short prompt tests for DeepSeek-V3.2 (vllm-project#6499)
  [Refactor] MLP weight prefetch to consistency with MoE Model's prefetching in terms of code and usage (vllm-project#6442)
  [bugfix][npugraph_ex]duplicate pattern issue (vllm-project#6513)
  [bugfix][npugraph_ex]add the extra check for allreduce rmsnorm fusion pass (vllm-project#6430)
  [Quant] GLM4.7-Flash Support W8A8 (vllm-project#6492)
  [Nightly][BugFix] Remove kv_cache nz test case for test_mla_preprocess_nq.py (vllm-project#6505)
  ...
chenchuw886 pushed a commit to chenchuw886/vllm-ascend that referenced this pull request Feb 12, 2026
### What this PR does / why we need it?
Backport 0.13.0 release note to main branch and update related doc link

### Does this PR introduce _any_ user-facing change?
yes
### How was this patch tested?
by doc CI

- vLLM version: v0.15.0
- vLLM main:
vllm-project/vllm@d7e17aa

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Signed-off-by: momochenchuw <chenchuw@huawei.com>
@wangxiyuan wangxiyuan mentioned this pull request Feb 24, 2026
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Feb 28, 2026
maoxx241 pushed a commit to maoxx241/vllm-ascend that referenced this pull request Mar 2, 2026
ZRJ026 pushed a commit to ZRJ026/vllm-ascend that referenced this pull request Mar 4, 2026
LCAIZJ pushed a commit to LCAIZJ/vllm-ascend that referenced this pull request Mar 7, 2026

Labels

documentation Improvements or additions to documentation

1 participant