
[ROCm][CI] Pinning lm-eval version to resolve multi-modal small eval bug #34038

Merged
vllm-bot merged 1 commit into vllm-project:main from ROCm:akaratza_pin_lm_eval
Feb 7, 2026

Conversation

@AndreasKaratzas AndreasKaratzas (Collaborator) commented Feb 7, 2026

Pins lm-eval[api] to ==0.4.9.2 in test requirements to fix a breaking incompatibility with the vLLM VLM evaluation backend.
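Concretely, the change amounts to a single pinned requirements line (per the review comment, in requirements/rocm-test.txt):

```text
lm-eval[api]==0.4.9.2
```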

Problem

The lm-eval-harness CI test for Qwen2.5-VL-7B-Instruct (and potentially other VLM configs) fails with:

TypeError: vllm.sampling_params.SamplingParams() argument after ** must be a mapping, not tuple

This occurs in lm_eval/models/vllm_vlms.py:299 inside _multimodal_model_generate. A newer version of lm-eval changed the return signature of modify_gen_kwargs to return a (kwargs_dict, stop_list, max_gen_toks) tuple, but the VLM backend still attempts to unpack the result as **kwargs into SamplingParams(...).
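The failure mode can be sketched in a few lines of Python. Note this is a minimal stand-in, not the real code: `sampling_params` below is a hypothetical simplification of `vllm.sampling_params.SamplingParams`, and the dict/tuple values are made up for illustration.

```python
# Hypothetical stand-in for vllm.sampling_params.SamplingParams(...).
def sampling_params(**kwargs):
    return kwargs

# Older lm-eval: modify_gen_kwargs returned a plain kwargs dict,
# so **-unpacking it into SamplingParams worked.
old_kwargs = {"temperature": 0.0, "max_tokens": 32}
print(sampling_params(**old_kwargs))

# Newer lm-eval: it returns a (kwargs_dict, stop_list, max_gen_toks) tuple,
# but the VLM backend still unpacks the result with ** as if it were a dict.
new_result = ({"temperature": 0.0}, ["</s>"], 32)
try:
    sampling_params(**new_result)
except TypeError as err:
    # "... argument after ** must be a mapping, not tuple"
    print(err)
```

Pinning to 0.4.9.2 keeps the dict-returning behavior until the VLM backend is updated to unpack the tuple.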

Signed-off-by: Andreas Karatzas <akaratza@amd.com>
@mergify mergify bot added the ci/build, rocm (Related to AMD ROCm), and bug (Something isn't working) labels Feb 7, 2026
@github-project-automation github-project-automation bot moved this to Todo in AMD Feb 7, 2026
@gemini-code-assist gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request pins the lm-eval dependency to version 0.4.9.2 in requirements/rocm-test.txt. This is a necessary change to fix a breaking incompatibility introduced in a newer version of lm-eval, which was causing CI failures. The fix is correct and improves the stability of the test environment. The change is approved.

@DarkLight1337
Member

Unfortunately this is incompatible with transformers v5 (#33994), but let's fix CI first.

@vllm-bot vllm-bot merged commit c490d8c into vllm-project:main Feb 7, 2026
9 of 12 checks passed
@github-project-automation github-project-automation bot moved this from Todo to Done in AMD Feb 7, 2026
@AndreasKaratzas AndreasKaratzas deleted the akaratza_pin_lm_eval branch February 7, 2026 07:35
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request Feb 19, 2026
…bug (vllm-project#34038)

Signed-off-by: Andreas Karatzas <akaratza@amd.com>
tunglinwood pushed a commit to tunglinwood/vllm that referenced this pull request Mar 4, 2026
…bug (vllm-project#34038)

Signed-off-by: Andreas Karatzas <akaratza@amd.com>

Labels

bug (Something isn't working), ci/build, rocm (Related to AMD ROCm)

Projects

Status: Done


3 participants