
Fix call to modify_gen_kwargs in vllm_vlms.py#3573

Merged
baberabb merged 2 commits into EleutherAI:main from hmellor:fix-vllm on Feb 11, 2026

Conversation

@hmellor
Contributor

@hmellor hmellor commented Feb 10, 2026

#3509 changed the return type of VLLM.modify_gen_kwargs to tuple[dict, list[str], int] without updating a call site in the VLLM_VLM(VLLM) subclass.

This PR updates that call site so that the last two elements of the tuple are unpacked and ignored.
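A minimal sketch of the mismatch this PR fixes. The stand-in signature and argument list below are simplified assumptions for illustration, not the harness's actual code; only the name modify_gen_kwargs and the tuple[dict, list[str], int] return type come from the discussion above.

```python
def modify_gen_kwargs(kwargs: dict) -> tuple[dict, list[str], int]:
    """Stand-in for VLLM.modify_gen_kwargs after #3509: it now returns
    the cleaned kwargs plus stop sequences and a max-token count.
    The default values here are illustrative assumptions."""
    stop = kwargs.pop("until", ["</s>"])
    max_tokens = kwargs.pop("max_gen_toks", 256)
    return kwargs, stop, max_tokens


# Pre-#3509, the VLM call site treated the return value as a plain dict:
result = modify_gen_kwargs({"temperature": 0.0})
assert isinstance(result, tuple)  # now a 3-tuple, so dict access downstream breaks

# The fix: unpack the tuple and discard the last two elements.
gen_kwargs, _, _ = modify_gen_kwargs({"temperature": 0.0})
assert isinstance(gen_kwargs, dict)
```

Unpacking with `_, _` keeps the VLM subclass behaviour unchanged while matching the new return spec.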

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor hmellor requested a review from baberabb as a code owner February 10, 2026 14:04
@CLAassistant

CLAassistant commented Feb 10, 2026

CLA assistant check
All committers have signed the CLA.

@baberabb
Contributor

LGTM! Thanks for the PR

@baberabb baberabb merged commit b177b36 into EleutherAI:main Feb 11, 2026
6 checks passed
@hmellor hmellor deleted the fix-vllm branch February 11, 2026 09:05
@hmellor
Contributor Author

hmellor commented Feb 11, 2026

Thanks for the quick merge! If possible a patch release would be greatly appreciated. This PR was made as part of the effort to upgrade vLLM to Transformers v5.

@baberabb
Contributor

baberabb commented Feb 11, 2026

> Thanks for the quick merge! If possible a patch release would be greatly appreciated. This PR was made as part of the effort to upgrade vLLM to Transformers v5.

Will try by Friday, unless it's very urgent? Also pushed #3582. The changes are mostly type fixes, but I forgot to mention the normalize_gen_kwargs utility we added recently.

@hmellor
Contributor Author

hmellor commented Feb 11, 2026

Friday is good for me. I just want to make sure that we can bump lm-eval in vLLM before I bump transformers, and there are a few more Transformers-upgrade-related issues to work through.

Tracin pushed a commit to Tracin/lm-evaluation-harness that referenced this pull request Mar 11, 2026
* Fix call to `modify_gen_kwargs` in `vllm_vlms.py`

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>

* pacify pre-commit

---------

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Co-authored-by: Baber <baber@hey.com>