[CI/Build] Update LM Eval Version in AMD CI #27944

Merged: heheda12345 merged 1 commit into vllm-project:main from zhewenl:update-amd-lm-eval-ver on Nov 4, 2025

Conversation

@zhewenl (Collaborator) commented Nov 2, 2025

Purpose

This PR updates the lm-eval dependency used in the AMD CI to match the version used in the Nvidia CI.

Test Plan

Before this fix, the V1 Test others job fails with a TypeError because the pinned lm-eval still calls LLM.generate() with the V0-era prompt_token_ids keyword, which was removed in V1:

/usr/local/lib/python3.12/dist-packages/lm_eval/models/vllm_causallms.py:268: TypeError
    outputs = self.model.generate(
        prompt_token_ids=requests,
        sampling_params=sampling_params,
        use_tqdm=True if self.batch_size == "auto" else False,
    )
E   TypeError: LLM.generate() got an unexpected keyword argument 'prompt_token_ids'

After the fix, the issue is gone: https://buildkite.com/vllm/amd-ci/builds/842#019a46a0-7764-4405-bd41-e50e6ceb0505
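The failure above comes from an API change: vLLM's V1 engine dropped the `prompt_token_ids=` keyword from `LLM.generate()`, and pre-tokenized inputs are instead wrapped in `TokensPrompt` inputs passed through the positional `prompts` argument. A minimal sketch of the two call shapes, using pure-Python stand-ins for illustration rather than the real vLLM classes:

```python
# Stand-ins for illustration only -- in vLLM the real types live in
# vllm / vllm.inputs, where TokensPrompt is a dict-like input carrying
# a "prompt_token_ids" key.
def tokens_prompt(prompt_token_ids):
    return {"prompt_token_ids": prompt_token_ids}

class FakeLLM:
    """Mimics the V1 signature: no prompt_token_ids keyword argument."""
    def generate(self, prompts, sampling_params=None, use_tqdm=True):
        # Each prompt is a dict carrying the pre-tokenized input.
        return [p["prompt_token_ids"] for p in prompts]

llm = FakeLLM()
requests = [[1, 2, 3], [4, 5]]

# Old harness call shape -- raises TypeError on V1, as in the CI log:
try:
    llm.generate(prompt_token_ids=requests)
except TypeError:
    pass  # "got an unexpected keyword argument 'prompt_token_ids'"

# Newer call shape: token IDs wrapped in TokensPrompt-style inputs.
outputs = llm.generate([tokens_prompt(r) for r in requests])
assert outputs == requests
```

Updating the pinned lm-eval to a commit that uses the newer call shape is what resolves the CI failure.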

@mergify mergify bot added the ci/build and rocm labels Nov 2, 2025
Signed-off-by: zhewenli <zhewenli@meta.com>
@zhewenl zhewenl force-pushed the update-amd-lm-eval-ver branch from 130688f to 0557a9a on November 3, 2025 00:06
@zhewenl zhewenl marked this pull request as ready for review November 3, 2025 00:11
@zhewenl zhewenl requested a review from gshtras as a code owner November 3, 2025 00:11
mteb[bm25s]>=1.38.11, <2

# Required for eval tests
lm-eval[api] @ git+https://github.com/EleutherAI/lm-evaluation-harness.git@206b7722158f58c35b7ffcd53b035fdbdda5126d
Collaborator:

Why are you pinning a specific commit instead of a release version?

Collaborator:

Makes sense.

@heheda12345 heheda12345 enabled auto-merge (squash) November 4, 2025 05:35
@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Nov 4, 2025
@heheda12345 heheda12345 merged commit 2f84ae1 into vllm-project:main Nov 4, 2025
19 checks passed
@zhewenl zhewenl deleted the update-amd-lm-eval-ver branch November 4, 2025 07:14
ZhengHongming888 pushed a commit to ZhengHongming888/vllm that referenced this pull request Nov 8, 2025
Signed-off-by: zhewenli <zhewenli@meta.com>
devpatelio pushed a commit to SumanthRH/vllm that referenced this pull request Nov 29, 2025
Signed-off-by: zhewenli <zhewenli@meta.com>