Upgrade vLLM to v0.10.0 #1927
Conversation
**Codecov Report**

❌ Patch coverage is

Additional details and impacted files:

| Coverage Diff | main | #1927 | +/- |
|---------------|--------|--------|--------|
| Coverage      | 73.12% | 73.16% | +0.04% |
| Files         | 90     | 90     |        |
| Lines         | 9956   | 9929   | -27    |
| Hits          | 7280   | 7265   | -15    |
| Misses        | 2676   | 2664   | -12    |
This pull request has conflicts; please resolve them before we can evaluate the pull request.
Force-pushed 5c87c7c to 3ca3946
This pull request has conflicts; please resolve them before we can evaluate the pull request.
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
To speed up CI, I also ran the tests on #2030; I suggest upgrading this after that.
@ganyi1996ppo @jianzs @wangxiyuan Please take a look.
wangxiyuan left a comment
OK, let's merge asap to unblock other PRs

### What this PR does / why we need it?
- Upgrade to v0.10.0
- Drop v0.9.2 version compatibility
- Add patch `vllm_ascend/patch/worker/patch_common/patch_sampler_gather_logprobs.py` as a workaround for vllm-project/vllm@f3a683b on v0.10.0, and add the e2e test `test_models_prompt_logprobs` (both sketched below)
- Pin transformers<4.54.0 as a workaround for vllm-project#2034

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Tested locally: `VLLM_USE_MODELSCOPE=true pytest -sv tests/e2e/singlecard/test_offline_inference.py::test_models_prompt_logprobs`
- CI passed
- vLLM version: v0.9.2
- vLLM main: vllm-project/vllm@7728dd7
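
A minimal sketch of the sampler-patch mechanism mentioned above, assuming the vllm-ascend convention that modules under `vllm_ascend/patch/worker/patch_common/` rebind an upstream method at import time. The upstream module path, method signature, and replacement body are illustrative assumptions, not the repository's actual code; check `patch_sampler_gather_logprobs.py` and the vLLM v0.10.0 source when adapting this.

```python
# Illustrative sketch only, not the actual patch. Assumes vLLM v0.10.0's v1
# sampler lives at vllm.v1.sample.sampler.Sampler and exposes gather_logprobs;
# verify against the upstream source before reusing.
import torch
from vllm.v1.sample.sampler import Sampler

_original_gather_logprobs = Sampler.gather_logprobs


def _gather_logprobs_npu(self, logprobs: torch.Tensor, num_logprobs: int,
                         token_ids: torch.Tensor):
    # Hypothetical wrapper: keep behaviour that is safe on Ascend NPUs here
    # (the workaround for vllm-project/vllm@f3a683b), then delegate to the
    # original implementation.
    return _original_gather_logprobs(self, logprobs, num_logprobs, token_ids)


# vllm-ascend patch modules take effect at import time by rebinding the
# upstream method; importing this module is what activates the workaround.
Sampler.gather_logprobs = _gather_logprobs_npu
```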
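
And a hedged sketch of the kind of check the new `test_models_prompt_logprobs` e2e test performs, built on vLLM's offline `LLM`/`SamplingParams` API. The model name and logprob counts are placeholders; the real test is the one invoked by the pytest command listed under "How was this patch tested?".

```python
# Minimal sketch, not the repository's test. Model name and logprob counts
# are illustrative placeholders.
from vllm import LLM, SamplingParams


def test_models_prompt_logprobs():
    llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")  # hypothetical small model
    params = SamplingParams(max_tokens=8, logprobs=5, prompt_logprobs=5)
    outputs = llm.generate(["Hello, my name is"], params)
    # The patched code path gathers prompt logprobs, so the key assertion is
    # that they come back populated rather than None.
    assert outputs[0].prompt_logprobs is not None
```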