
[Doc] Add release note for v0.11.0rc1 #3931

Merged
wangxiyuan merged 1 commit into vllm-project:main from wangxiyuan:add_releasenote_0110rc1
Nov 10, 2025

Conversation

wangxiyuan (Collaborator) commented Oct 31, 2025

Add release note for v0.11.0rc1.

github-actions (Bot) commented

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message by filling out the PR description to help reviewers and future developers understand.

If CI fails, you can run the linting and testing checks locally, following the Contributing and Testing guides.

github-actions (Bot) added the documentation label Oct 31, 2025

gemini-code-assist (Bot) left a comment


Code Review

This pull request adds the release notes for v0.11.0rc1. The changes are limited to documentation. I have found a potentially misleading statement regarding a feature's default enablement status, which could cause confusion for users. Please see the specific comment for details.

Comment thread docs/source/user_guide/release_notes.md Outdated
### Highlights
- Deepseek series models work with aclgraph now.
- PrefixCache and Chunked Prefill are enabled by default.
- W4A8 quantization is supported now.
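Since prefix caching and chunked prefill are now on by default, users who want to reproduce the previous behavior need to disable them explicitly. A minimal sketch follows; the flag names are upstream vLLM engine arguments and the model name is a placeholder, so verify both against your installed version with `vllm serve --help`.

```shell
# Hypothetical sketch: turn off the now-default features when comparing
# against earlier releases. Flag names follow upstream vLLM's engine
# arguments; confirm them with `vllm serve --help` for your version.
vllm serve Qwen/Qwen2.5-7B-Instruct \
    --no-enable-prefix-caching \
    --no-enable-chunked-prefill
```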
Collaborator

Maybe we can highlight our refactor efforts here? After a series of arduous efforts, it is no longer necessary to re-register language models in vLLM Ascend :)

Collaborator (Author)

The release note is for end users, and the model refactor doesn't affect them. I'll mention it in the Others section.

Comment thread docs/source/user_guide/release_notes.md Outdated
v0.11.0 will be the next official release version of vLLM Ascend. We'll release it in the next few days. Any feedback is welcome to help us improve v0.11.0.

### Highlights
- Deepseek series models work with aclgraph now.
whx-sjtu (Collaborator) commented Oct 31, 2025

Will this release include aclgraph support for DeepSeek V3.2? It may be better to narrow the scope of "Deepseek series models" to, say, DeepSeek V2/V3; the series also includes DeepSeek OCR, which might cause misunderstanding.

Comment thread docs/source/user_guide/release_notes.md Outdated
### Core
- Performance of Qwen and Deepseek series models is improved.
- Mooncake store connector, Mooncake layerwise connector, and CPU offload connector are supported now.
- MTP > 1 is supported now.
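As discussed below, MTP > 1 requires an extra parameter on top of vLLM v0.11.0 and the details live in the MTP developer guide; the sketch here only shows the general shape of a speculative-decoding launch. The `--speculative-config` flag is upstream vLLM's JSON engine argument, and the method name and token count are illustrative assumptions, not the Ascend-specific setting.

```shell
# Hypothetical sketch of serving with MTP-style speculative decoding.
# The exact extra parameter needed for MTP > 1 on vLLM Ascend is documented
# in the MTP developer guide; values below are illustrative only.
vllm serve deepseek-ai/DeepSeek-V3 \
    --speculative-config '{"method": "deepseek_mtp", "num_speculative_tokens": 2}'
```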
Collaborator

Since vLLM v0.11.0 doesn't support MTP > 1, we need to set another parameter to enable this case; this will be explained in the MTP developer guide. Should we add a link here to explain this?

Collaborator (Author)

yes, it's fine to add any link

Comment thread docs/source/user_guide/release_notes.md Outdated
wangxiyuan force-pushed the add_releasenote_0110rc1 branch from a836489 to 382a074 on November 4, 2025 01:14
whx-sjtu (Collaborator) left a comment

LGTM

wangxiyuan force-pushed the add_releasenote_0110rc1 branch 4 times, most recently from d3d53e7 to f486a59 on November 10, 2025 06:26
Comment thread docs/source/community/versioning_policy.md
Comment thread docs/source/user_guide/release_notes.md Outdated
- W4A4 quantization is supported now. [#3427](https://github.com/vllm-project/vllm-ascend/pull/3427)

### Core
- Performance of Qwen and Deepseek series models is improved.
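For the W4A4/W4A8 quantization support mentioned in the quoted notes above, a minimal serving sketch follows. This assumes a checkpoint already quantized by an external tool such as msModelSlim; the model path is a placeholder, and the `--quantization ascend` flag should be checked against the vLLM Ascend quantization guide for your release.

```shell
# Hypothetical sketch: serving a pre-quantized W4A8/W4A4 checkpoint on Ascend.
# The model path is a placeholder for weights produced by a quantization tool
# such as msModelSlim; verify the flag in the vLLM Ascend quantization guide.
vllm serve /path/to/quantized-model --quantization ascend
```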
Member

give a link or doc here

Comment thread docs/source/user_guide/release_notes.md Outdated
- LLMDatadist KV Connector is deprecated. We'll remove it in Q1 2026.
- Refactor the linear module to support features flashcomm1 and flashcomm2 in paper [flashcomm](https://arxiv.org/pdf/2412.04964) [#3004](https://github.com/vllm-project/vllm-ascend/pull/3004) [#3334](https://github.com/vllm-project/vllm-ascend/pull/3334)

### Known issue
Member

Each known issue should have a related issue link

wangxiyuan force-pushed the add_releasenote_0110rc1 branch 2 times, most recently from d53b621 to 9251180 on November 10, 2025 12:42
Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
wangxiyuan force-pushed the add_releasenote_0110rc1 branch from 9251180 to 2330b25 on November 10, 2025 12:49
Comment thread docs/source/faqs.md

- [[v0.9.1] FAQ & Feedback](https://github.com/vllm-project/vllm-ascend/issues/2643)
- [[v0.11.0rc0] FAQ & Feedback](https://github.com/vllm-project/vllm-ascend/issues/3222)
- [[v0.11.0rc1] FAQ & Feedback](https://github.com/vllm-project/vllm-ascend/issues/3222)
Collaborator

Please update the issue number.

wangxiyuan merged commit 64220c6 into vllm-project:main on Nov 10, 2025
17 checks passed
wangxiyuan deleted the add_releasenote_0110rc1 branch on November 18, 2025 08:58
luolun pushed a commit to luolun/vllm-ascend that referenced this pull request Nov 19, 2025
Add release note for v0.11.0rc1.


- vLLM version: v0.11.0
- vLLM main:
vllm-project/vllm@83f478b

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>
Signed-off-by: luolun <luolun1995@cmbchina.com>
hwhaokun pushed a commit to hwhaokun/vllm-ascend that referenced this pull request Nov 19, 2025
Signed-off-by: hwhaokun <haokun0405@163.com>
NSDie pushed a commit to NSDie/vllm-ascend that referenced this pull request Nov 24, 2025
Signed-off-by: nsdie <yeyifan@huawei.com>
Clorist33 pushed a commit to Clorist33/vllm-ascend that referenced this pull request Dec 10, 2025

Labels

documentation (Improvements or additions to documentation)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

7 participants