[ROCm][CI] Pinning lm-eval version to resolve multi-modal small eval bug #34038
Merged
vllm-bot merged 1 commit into vllm-project:main on Feb 7, 2026
Conversation
Signed-off-by: Andreas Karatzas <akaratza@amd.com>
Contributor
Code Review
This pull request pins the lm-eval dependency to version 0.4.9.2 in requirements/rocm-test.txt. This is a necessary change to fix a breaking incompatibility introduced in a newer version of lm-eval, which was causing CI failures. The fix is correct and improves the stability of the test environment. The change is approved.
Member
Unfortunately this is incompatible with transformers v5 (#33994), but let's fix CI first.
DarkLight1337 approved these changes on Feb 7, 2026
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request on Feb 19, 2026:
[ROCm][CI] Pinning lm-eval version to resolve multi-modal small eval bug (vllm-project#34038) Signed-off-by: Andreas Karatzas <akaratza@amd.com>
tunglinwood pushed a commit to tunglinwood/vllm that referenced this pull request on Mar 4, 2026:
[ROCm][CI] Pinning lm-eval version to resolve multi-modal small eval bug (vllm-project#34038) Signed-off-by: Andreas Karatzas <akaratza@amd.com>
Pins lm-eval[api] to ==0.4.9.2 in test requirements to fix a breaking incompatibility with the vLLM VLM evaluation backend.
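The change itself is a one-line pin in requirements/rocm-test.txt. A sketch of the diff is shown below; only the pinned line is from this PR, and the previous unpinned specifier is an assumption:

```diff
--- a/requirements/rocm-test.txt
+++ b/requirements/rocm-test.txt
-lm-eval[api]
+lm-eval[api]==0.4.9.2
```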
Problem

The lm-eval-harness CI test for Qwen2.5-VL-7B-Instruct (and potentially other VLM configs) fails with a TypeError raised in lm_eval/models/vllm_vlms.py:299, inside _multimodal_model_generate. A newer version of lm-eval changed the return signature of modify_gen_kwargs to return a (kwargs_dict, stop_list, max_gen_toks) tuple, but the VLM backend still attempts to unpack the result as **kwargs into SamplingParams(...).
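For illustration, here is a minimal sketch of the mismatch. The helper bodies and kwargs are assumptions based on the description above, not the actual lm-eval source; only the return shapes and the **kwargs unpack into SamplingParams follow the PR:

```python
# Sketch of the signature change; helper bodies are illustrative assumptions.
from vllm import SamplingParams


def modify_gen_kwargs_0492(kwargs: dict) -> dict:
    # Pinned 0.4.9.2 behavior (assumed shape): return a plain kwargs dict.
    kwargs.setdefault("temperature", 0.0)
    return kwargs


def modify_gen_kwargs_newer(kwargs: dict):
    # Newer behavior per the PR description: return a
    # (kwargs_dict, stop_list, max_gen_toks) tuple instead of a dict.
    stop = kwargs.pop("until", [])
    max_gen_toks = kwargs.pop("max_gen_toks", 256)
    kwargs.setdefault("temperature", 0.0)
    return kwargs, stop, max_gen_toks


gen_kwargs = {"until": ["</s>"], "max_gen_toks": 32}

# The VLM backend still performs the dict-era unpack:
result = modify_gen_kwargs_newer(dict(gen_kwargs))
try:
    SamplingParams(**result)  # ** requires a mapping; result is a tuple
except TypeError as exc:
    print(f"Fails on newer lm-eval: {exc}")

# With the pinned version, the same unpack succeeds:
params = SamplingParams(**modify_gen_kwargs_0492({"max_tokens": 32}))
print(params.temperature, params.max_tokens)
```

Pinning restores the dict return shape, so the existing **kwargs unpack in the VLM backend works again without touching vLLM code.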