[CI] Upgrade vLLM to 20250919 (6d8246aa) and fix some broken issue#2907
Yikun merged 37 commits into vllm-project:main
Conversation
👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:
If CI fails, you can run the linting and testing checks locally according to Contributing and Testing.
Code Review
This pull request appears to fix CI issues by adapting the code to a newer version of vLLM, particularly around multi-modal input handling. The changes introduce version-specific logic to maintain backward compatibility. My review focuses on improving the maintainability of this new logic by reducing code duplication and fixing a potential bug. I've identified two areas where helper functions can be used to create a single, unified implementation for different vLLM versions, which is a pattern already used effectively elsewhere in the changed files.
Codecov Report

❌ Patch coverage report — additional details and impacted files:

@@            Coverage Diff             @@
##             main    #2907      +/-   ##
==========================================
- Coverage   74.76%   71.95%   -2.82%
==========================================
  Files         150      168      +18
  Lines       20891    23547    +2656
==========================================
+ Hits        15620    16943    +1323
- Misses       5271     6604    +1333

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
This pull request has conflicts, please resolve those before we can evaluate the pull request. |
  strategy:
    matrix:
-     vllm_version: [v0.10.2]
+     vllm_version: [main, v0.10.2]
What I meant was using the latest hash here: vllm-project/vllm@68dbde5
-     vllm_version: [main, v0.10.2]
+     vllm_version: [68dbde5, v0.10.2]
Bump and address upstream changes daily:
- pros: this moves us from being reactive to proactive, avoiding community-level CI errors.
- cons:
  - maintainers will need to review carefully, especially certain code lines
  - we need to upgrade the pinned main hash manually.
OK. By the way, we must use the full commit hash, like 68dbde5dbb11b9250454d0c9f21a8b3da960b341; otherwise actions/checkout@v4 will fail. I have fallen into that pit before.
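The full-hash requirement is easy to guard against before the workflow ever runs: a short prefix like `68dbde5` is not a valid pin for `actions/checkout@v4`, while the 40-character SHA is. A minimal validation sketch (the helper name is my own, not from the repo):

```python
import re

def is_full_commit_hash(ref: str) -> bool:
    # actions/checkout@v4 needs the full 40-character SHA when pinning a
    # commit; a short hash like "68dbde5" makes the checkout fail.
    return re.fullmatch(r"[0-9a-f]{40}", ref) is not None

print(is_full_commit_hash("68dbde5"))                                   # False
print(is_full_commit_hash("68dbde5dbb11b9250454d0c9f21a8b3da960b341"))  # True
```

Running such a check in the bump workflow would fail fast instead of leaving a broken pin in the CI matrix.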
Nit: I have an auto workflow to help with the bump; I will submit it in this PR or a follow-up, and any comments or suggestions are welcome.

name: Bump vllm latest commit hash for CI
on:
  schedule:
    - cron: '0 16 * * *'  # At UTC+8 24:00 every day
  workflow_dispatch:

jobs:
  bumper:
    name: Bump vllm latest commit hash for CI
    runs-on: ubuntu-latest
    steps:
      - name: Checkout vllm
        uses: actions/checkout@v4
        with:
          repository: vllm-project/vllm
      - name: Get latest commit hash
        id: get_hash
        run: echo "commit_hash=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT
    outputs:
      commit_hash: ${{ steps.get_hash.outputs.commit_hash }}

  create_pr:
    runs-on: ubuntu-latest
    needs: bumper
    env:
      UPSTREAM_REPO: vllm-project/vllm-ascend
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          repository: vllm-ascend-ci/vllm-ascend
          token: ${{ secrets.PAT_TOKEN }}
          ref: main
      - name: Add upstream remote
        run: |
          git remote add upstream https://github.com/${{ env.UPSTREAM_REPO }}.git
          git fetch upstream
          git remote -v
      - name: Set Git user info dynamically
        run: |
          git config user.name "${{ github.actor }}"
          git config user.email "${{ github.actor }}@users.noreply.github.com"
      - name: Create or switch to branch
        run: |
          TIMESTAMP=$(date +%Y%m%d%H%M%S)
          BRANCH_NAME="auto-pr/Bumper-${TIMESTAMP}"
          echo "BRANCH_NAME=${BRANCH_NAME}" >> $GITHUB_ENV
          git checkout -B "${BRANCH_NAME}" upstream/main
      - name: Add vllm commit hash to vllm_ascend_test.yaml
        env:
          GITHUB_TOKEN: ${{ secrets.PAT_TOKEN }}
        run: |
          # Rewrite the pinned hash before committing (substitution pattern
          # assumed); without an edit here, git add stages an unchanged file.
          sed -i "/vllm_version:/s/[0-9a-f]\{40\}/${{ needs.bumper.outputs.commit_hash }}/" ./vllm_ascend_test.yaml
          git add ./vllm_ascend_test.yaml
          git commit -s -m "[CI] Bump vllm commit hash to ${{ needs.bumper.outputs.commit_hash }}"
          git push -f origin "${{ env.BRANCH_NAME }}"
      - name: Create PR in upstream via API
        uses: actions/github-script@v8
        with:
          github-token: ${{ secrets.PAT_TOKEN }}
          script: |
            const pr = await github.rest.pulls.create({
              owner: 'vllm-project',
              repo: 'vllm-ascend',
              head: `vllm-ascend-ci:${{ env.BRANCH_NAME }}`,
              base: 'main',
              title: `[CI] Bump vllm commit hash to ${{ needs.bumper.outputs.commit_hash }}`,
              body: `This PR bumps the vllm commit hash to ${{ needs.bumper.outputs.commit_hash }} for CI purposes.`,
            });
            console.log(`Created PR #${pr.data.number}`);
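The file rewrite that the bump step needs can be sketched and tested locally before wiring it into the workflow. A minimal Python version, where the file name and the `vllm_version:` matrix pattern are assumptions for illustration:

```python
import re
import tempfile
from pathlib import Path

def bump_pinned_hash(workflow: Path, new_hash: str) -> str:
    # Replace the 40-character pinned hash in the vllm_version matrix
    # entry, leaving the released-version entries (e.g. v0.10.2) alone.
    text = workflow.read_text()
    updated = re.sub(r"(vllm_version: \[)[0-9a-f]{40}",
                     r"\g<1>" + new_hash, text)
    workflow.write_text(updated)
    return updated

# Demo on a throwaway file with an old pinned hash.
with tempfile.TemporaryDirectory() as d:
    f = Path(d) / "vllm_ascend_test.yaml"
    f.write_text("        vllm_version: [" + "a" * 40 + ", v0.10.2]\n")
    out = bump_pinned_hash(f, "b" * 40)
    print("b" * 40 in out)  # True
```

Keeping the rewrite as one targeted substitution means an unexpected file layout leaves the file untouched, so the subsequent `git commit` would fail loudly instead of pushing a bad pin.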
It seems we need to remove vllm-ascend/.github/workflows/format_pr_body.yaml Lines 42 to 46 in af2a886, and pin env.VLLM_COMMIT to the static hash.
It seems the failed cases in CI are a known issue; let's skip them.
This pull request has conflicts, please resolve those before we can evaluate the pull request. |
Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: wangli <wangli858794774@gmail.com>
Yikun
left a comment
This patch only fixes the upstream interface changes; let's merge it to recover CI.
Signed-off-by: wangli <wangli858794774@gmail.com>
…lm-project#2907)

### What this PR does / why we need it?
1. This PR bumps the vllm commit to vllm-project/vllm@6d8246a
2. Fix upstream change vllm-project/vllm#24548, which removed multi-modal kwargs; make vllm main and `v0.10.2` both adaptable
3. Fix metadata_builder changes introduced by vllm-project/vllm#23693
4. Fix `structured_outputs_config` changes introduced by vllm-project/vllm#22772
5. Fix `moe_config` changes introduced by vllm-project/vllm#22537

Co-authored-by: MengqingCao <cmq0113@163.com>
Co-authored-by: Yikun Jiang <yikunkero@gmail.com>

- vLLM version: v0.10.2
- vLLM main: vllm-project/vllm@c60e613

---------

Signed-off-by: wangli <wangli858794774@gmail.com>
Signed-off-by: MengqingCao <cmq0113@163.com>
Co-authored-by: MengqingCao <cmq0113@163.com>
What this PR does / why we need it?
1. This PR bumps the vllm commit to vllm-project/vllm@6d8246a
2. Fix upstream change vllm-project/vllm#24548, which removed multi-modal kwargs; make vllm main and `v0.10.2` both adaptable
3. Fix `structured_outputs_config` changes introduced by [Chore] Cleanup guided namespace, move to structured outputs config (vllm#22772)
4. Fix `moe_config` changes introduced by [Kernel] Delegate construction of FusedMoEQuantConfig to FusedMoEMethodBase subclasses (vllm#22537)

Co-authored-by: MengqingCao cmq0113@163.com
Co-authored-by: Yikun Jiang yikunkero@gmail.com
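Making vllm main and `v0.10.2` both adaptable typically comes down to a version gate around the changed interface. A minimal Python sketch of the pattern, where `vllm_version_is` is a hypothetical helper and both branch bodies are placeholders rather than the PR's actual adaptation:

```python
from importlib.metadata import PackageNotFoundError, version

def vllm_version_is(target: str) -> bool:
    # Hypothetical helper (name assumed): True when the installed vLLM
    # release exactly matches `target`; False when vLLM is absent.
    try:
        return version("vllm") == target
    except PackageNotFoundError:
        return False

# Dispatch: v0.10.2 keeps the legacy multi-modal kwargs path, while a
# newer pinned main commit takes the updated interface.
if vllm_version_is("0.10.2"):
    mm_kwargs_mode = "legacy"
else:
    mm_kwargs_mode = "updated"
print(mm_kwargs_mode)
```

Centralizing the check in one helper keeps the version-specific branches greppable, which matters once the pinned main hash starts moving daily.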