
Upgrade vLLM to v0.10.0 #1927

Merged
wangxiyuan merged 4 commits into vllm-project:main from Yikun:upgrade-v0.10 on Jul 26, 2025

Conversation

@Yikun (Member) commented Jul 22, 2025

What this PR does / why we need it?

- Upgrade to v0.10.0
- Drop v0.9.2 version compatibility
- Add a patch, `vllm_ascend/patch/worker/patch_common/patch_sampler_gather_logprobs.py`, as a workaround for vllm-project/vllm@f3a683b on v0.10.0, and add the e2e test `test_models_prompt_logprobs`
- Pin transformers<4.54.0 as a workaround for vllm-project#2034

Does this PR introduce any user-facing change?

No

How was this patch tested?

- Test locally: `VLLM_USE_MODELSCOPE=true pytest -sv tests/e2e/singlecard/test_offline_inference.py::test_models_prompt_logprobs`
- CI passed
- vLLM version: v0.9.2
- vLLM main: vllm-project/vllm@7728dd7
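The patched module itself isn't shown on this page, but the mechanism is vllm-ascend's usual import-time monkey patch: replace an upstream method with a reimplementation until the upstream fix ships. A minimal sketch, assuming the target is `Sampler.gather_logprobs` in vLLM's v1 sampler; the adjustment in the body is illustrative, not the actual workaround:

```python
# Illustrative monkey patch in the style of
# vllm_ascend/patch/worker/patch_common/patch_sampler_gather_logprobs.py;
# the adjustment below is an assumption, not the real fix for
# vllm-project/vllm@f3a683b.
import torch
from vllm.v1.sample.sampler import Sampler

_orig_gather_logprobs = Sampler.gather_logprobs


def gather_logprobs(self, logprobs: torch.Tensor, num_logprobs: int,
                    token_ids: torch.Tensor):
    # Example adjustment: force an int64, contiguous index tensor before
    # delegating to the original implementation.
    token_ids = token_ids.to(torch.int64).contiguous()
    return _orig_gather_logprobs(self, logprobs, num_logprobs, token_ids)


# patch_common modules apply the replacement at import time.
Sampler.gather_logprobs = gather_logprobs
```

The companion `transformers<4.54.0` pin, by contrast, is an ordinary upper-bound requirements constraint and needs no code.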

github-actions bot added the `documentation` label Jul 22, 2025
@Yikun added the `accuracy-test`, `performance-test`, and `ready-for-test` labels and removed the `documentation` label Jul 22, 2025
@codecov bot commented Jul 22, 2025

Codecov Report

❌ Patch coverage is 81.81818% with 2 lines in your changes missing coverage. Please review.
✅ Project coverage is 73.16%. Comparing base (e561a2c) to head (1b21dc4).
⚠️ Report is 592 commits behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| vllm_ascend/patch/platform/__init__.py | 50.00% | 1 Missing ⚠️ |
| vllm_ascend/patch/worker/__init__.py | 50.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1927      +/-   ##
==========================================
+ Coverage   73.12%   73.16%   +0.04%     
==========================================
  Files          90       90              
  Lines        9956     9929      -27     
==========================================
- Hits         7280     7265      -15     
+ Misses       2676     2664      -12     
| Flag | Coverage Δ |
|---|---|
| unittests | 73.16% <81.81%> (+0.04%) ⬆️ |
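Both files with missing lines are patch registries, and patch imports in vllm-ascend are typically gated on the detected vLLM version, so one branch can never execute in a single-version CI run. A minimal sketch of that pattern, assuming these `__init__.py` files use the `vllm_version_is` helper from `vllm_ascend.utils` (which module each branch imports is also an assumption):

```python
# Sketch of a version-gated patch registry such as
# vllm_ascend/patch/worker/__init__.py; the modules imported on each
# branch are assumptions.
from vllm_ascend.utils import vllm_version_is

if vllm_version_is("0.10.0"):
    # Released v0.10.0 still needs the sampler workaround.
    from vllm_ascend.patch.worker import patch_common  # noqa: F401
else:
    # vLLM main already contains the upstream fix, so skip the patch.
    pass
```

Under that assumption, each file reports exactly one missing line: the branch CI never takes.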


github-actions bot added the `documentation` and `module:tests` labels Jul 24, 2025
@Yikun changed the title from "Upgrade to v0.10.0" to "Upgrade vLLM to v0.10.0" Jul 24, 2025
@Yikun removed and re-added the `ready-for-test` label Jul 24, 2025
github-actions bot commented:

This pull request has conflicts, please resolve those before we can evaluate the pull request.

@Yikun removed the `merge-conflicts` label and re-applied `ready-for-test` Jul 25, 2025
@Yikun force-pushed the upgrade-v0.10 branch 4 times, most recently from 5c87c7c to 3ca3946, July 25, 2025 09:05
@Yikun re-applied the `ready-for-test` label again on Jul 25, 2025 to retrigger CI
Review threads: vllm_ascend/patch/platform/__init__.py (outdated), vllm_ascend/patch/worker/__init__.py
github-actions bot commented:

This pull request has conflicts, please resolve those before we can evaluate the pull request.

Yikun added commits on July 26, 2025, each Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
@Yikun marked this pull request as ready for review July 26, 2025 04:08
@Yikun (Member, Author) commented Jul 26, 2025

To speed up CI, I also ran the tests on #2030.

I suggest upgrading after https://github.com/vllm-project/vllm-ascend/actions/runs/16537010387?pr=2030 passes.

@Yikun (Member, Author) commented Jul 26, 2025

[image attachment]

@Yikun (Member, Author) commented Jul 26, 2025

@ganyi1996ppo @jianzs @wangxiyuan Please take a look.

@wangxiyuan (Collaborator) left a comment:

OK, let's merge ASAP to unblock other PRs.

@wangxiyuan wangxiyuan merged commit 17a430f into vllm-project:main Jul 26, 2025
23 of 25 checks passed
chopper0126 pushed a commit to chopper0126/vllm-ascend that referenced this pull request Sep 26, 2025
### What this PR does / why we need it?
- Upgrade to v0.10.0
- Drop v0.9.2 version compatibility
- Add patch for
`vllm_ascend/patch/worker/patch_common/patch_sampler_gather_logprobs.py`
as workaround of
vllm-project/vllm@f3a683b
for v0.10.0 and also add e2e test `test_models_prompt_logprobs`
- Pin transformers<4.54.0 as workaround of
vllm-project#2034

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Test locally: `VLLM_USE_MODELSCOPE=true pytest -sv tests/e2e/singlecard/test_offline_inference.py::test_models_prompt_logprobs`
- CI passed

- vLLM version: v0.9.2
- vLLM main:
vllm-project/vllm@7728dd7

---------

Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
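The commit message above adds `test_models_prompt_logprobs` as the regression check for the sampler patch. A minimal sketch of what such an e2e test could look like (the model, prompt, and assertion are assumptions; the real test lives in `tests/e2e/singlecard/test_offline_inference.py`):

```python
# Hypothetical sketch of test_models_prompt_logprobs; the model, prompt,
# and assertion are assumptions, not the actual test body.
from vllm import LLM, SamplingParams


def test_models_prompt_logprobs():
    llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct", max_model_len=1024)
    params = SamplingParams(max_tokens=8, prompt_logprobs=5)
    outputs = llm.generate(["The capital of France is"], params)
    # The workaround's observable effect: prompt logprobs are populated
    # instead of crashing during the gather step.
    assert outputs[0].prompt_logprobs is not None
```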
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Oct 21, 2025 (same commit message as above)
Clorist33 pushed a commit to Clorist33/vllm-ascend that referenced this pull request Dec 9, 2025 (same commit message as above)

Labels

accuracy-test, documentation, module:tests, performance-test, ready-for-test
