[Patch] Remove the patch of MiniCPM#5975
Conversation
Signed-off-by: gcanlin <canlinguosdu@gmail.com>
👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:
If CI fails, you can run linting and testing checks locally according to Contributing and Testing.
Code Review
This pull request removes an obsolete patch for MiniCPMAttention, which is a good cleanup as it simplifies the codebase. However, the corresponding test file tests/ut/patch/worker/patch_common/test_patch_minicpm.py has not been removed. This will cause an ImportError and break the test suite. Please remove the test file as well.
vllm_ascend/patch/worker/patch_minicpm.py (18-36)
With the removal of this patch file, the corresponding test file tests/ut/patch/worker/patch_common/test_patch_minicpm.py should also be removed. The test file imports from this patch file, and its removal without deleting the test file will lead to an ImportError, causing the CI to fail.
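The failure mode described above can be checked mechanically before deleting a patch module; a minimal sketch using only the standard library (the absent module name below is a stand-in for illustration, not the real `vllm_ascend` path):

```python
import importlib.util


def module_importable(name: str) -> bool:
    """Return True if `name` resolves to an importable module,
    without actually executing the module's code."""
    return importlib.util.find_spec(name) is not None


# A test file that imports a deleted patch module fails at collection
# time with ModuleNotFoundError; guarding up front catches the stale
# test before CI does.
assert module_importable("importlib")  # stdlib module: importable
assert not module_importable("patch_minicpm_demo_removed")  # stand-in for a deleted module
```

Running such a check over the imports of `tests/ut/patch/` would flag `test_patch_minicpm.py` as soon as the patch file is removed.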
Waiting for the main branch to upgrade to 0118. Then we can merge this PR.
Let's block this change until the vLLM change works with vllm-ascend on main.
@wangxiyuan Ready to merge now. |
No, the vLLM commit vllm-project/vllm@fe36bf5 is not included in v0.14.1. Let's merge this once we upgrade to 0.15.
Oh, okay. We're still keeping compatibility with 0.14.1. |
This pull request has conflicts; please resolve them before we can evaluate the pull request.
This PR can be rebased and merged now. @gcanlin
Done. |
…to qwen3next_rebase
* 'main' of https://github.com/vllm-project/vllm-ascend:
  - [Patch] Remove the patch of MiniCPM (vllm-project#5975)
  - [P/D] layerwise connector support recompute scheduler (vllm-project#5900)
  - [CI] Add workflow support for lint image build (vllm-project#6489)
  - [Bugfix] Fix problematic dummy_run & improper input_batch_size in eagle (vllm-project#6517)
  - [Refactor] 310p_e2e test case update (vllm-project#6539)
  - [Refactor] refactor p2p connector (vllm-project#6551)
  - [Refactor] refactor 310p attention impl and add ut (vllm-project#6579)
  - [Refactor] refactor 310p ops and add ut (vllm-project#6591)
  - [Ops][Refactor] Remove custom rotary_embedding operator (vllm-project#6523)
  - [Lint] Style: Convert `vllm-ascend/` to ruff format (new Batch vllm-project#8) (vllm-project#6604)
  - [Test] Add initial multi modal cases of Qwen2.5-VL-7B-Instruct for disaggregated encoder (vllm-project#5301)
  - [CI] Fix broken CI (vllm-project#6599)
  - [Lint] Style: Convert `vllm-ascend/` to ruff format (Batch vllm-project#10) (vllm-project#6173)
  - [Lint] Style: Convert `vllm-ascend/` to ruff format (Batch vllm-project#11) (vllm-project#6176)
  - [Lint] Style: Convert `vllm-ascend/` to ruff format (Batch vllm-project#8) (vllm-project#6129)
  - [Lint] Style: Convert `vllm-ascend/` to ruff format (Batch vllm-project#7) (vllm-project#6023)
  - [CI][Misc] Some improvement for github action (vllm-project#6587)
  - [Image] Bump mooncake version to v0.3.8.post1 (vllm-project#6428)
What this PR does / why we need it?
Part of #5304.
After vllm-project/vllm#32523 merged, we can remove the patch of `MiniCPMAttention`.

Does this PR introduce any user-facing change?

How was this patch tested?
Tested locally.
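For context on what was removed: vllm-ascend's worker patches follow a monkey-patching pattern, rebinding a method on an upstream vLLM class at import time. A minimal self-contained sketch of that pattern (the class body and the override below are illustrative stand-ins, not the actual vLLM or vllm-ascend code):

```python
# Illustrative monkey-patch pattern: swap a method on an upstream class.
# Everything here is a stand-in, not the real implementation.

class MiniCPMAttention:
    """Stand-in for the upstream vLLM attention class."""

    def forward(self, x: int) -> int:
        return x + 1  # pretend "upstream" behavior


def ascend_forward(self, x: int) -> int:
    """Hypothetical NPU-specific override a patch module would install."""
    return x * 2


# Applying the patch: rebind the method on the class itself, so every
# instance (existing and future) picks up the override.
MiniCPMAttention.forward = ascend_forward

assert MiniCPMAttention().forward(3) == 6  # patched behavior in effect

# Removing the patch module simply stops this rebinding from happening,
# so the upstream forward() is used again; any test that still imports
# the deleted patch module will raise ModuleNotFoundError at collection.
```

Once the upstream fix (vllm-project/vllm#32523) makes the override unnecessary, deleting the patch module, and its test, is the whole cleanup.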