
[Bugfix] Fix models and tests for transformers v5 #33977

Merged
DarkLight1337 merged 8 commits into vllm-project:main from zucchini-nlp:v5 on Feb 6, 2026

Conversation

@zucchini-nlp (Contributor) commented Feb 6, 2026

As per the title; related to #30566. All changes are compatible with transformers v4 as well.

@hmellor

Signed-off-by: raushan <raushan@huggingface.co>
Signed-off-by: raushan <raushan@huggingface.co>
@mergify mergify bot added multi-modality Related to multi-modality (#4194) qwen Related to Qwen models bug Something isn't working labels Feb 6, 2026
@gemini-code-assist bot (Contributor) left a comment:

Code Review

This pull request updates the codebase to be compatible with transformers v5. The changes primarily involve adapting models and processors to new APIs, including variable renames, function signature modifications, and adjustments to attribute access paths. Most of these changes are straightforward and necessary for the upgrade. However, I've identified a critical issue in vllm/transformers_utils/processors/ovis2_5.py where the removal of return_tensors="np" from a preprocess call is likely to cause a runtime error. A fix has been suggested for this issue. The other changes appear correct and well-implemented.
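To make the flagged failure mode concrete, here is a minimal, hypothetical sketch (not the actual Hugging Face image-processor API or the ovis2_5.py code) of why dropping `return_tensors="np"` from a `preprocess` call can turn into a runtime error: processors of this shape typically return nested Python lists by default and only stack them into NumPy arrays when `return_tensors="np"` is requested, so downstream code that touches `.shape` breaks.

```python
# Hypothetical sketch of the return_tensors="np" failure mode; the function
# name and behavior are assumptions modeled on typical processor APIs.
import numpy as np


def preprocess(images, return_tensors=None):
    """Mimic an image processor: nested Python lists by default,
    stacked NumPy arrays only when return_tensors="np" is requested."""
    pixel_values = [[float(p) for p in img] for img in images]
    if return_tensors == "np":
        return {"pixel_values": np.asarray(pixel_values, dtype=np.float32)}
    return {"pixel_values": pixel_values}


images = [[1, 2, 3], [4, 5, 6]]

# With return_tensors="np", downstream array operations work:
out = preprocess(images, return_tensors="np")
print(out["pixel_values"].shape)  # (2, 3)

# Without it, the same attribute access fails at runtime,
# because a plain list has no .shape:
out = preprocess(images)
try:
    out["pixel_values"].shape
except AttributeError as exc:
    print("runtime error:", exc)
```

This is the kind of silent type change (array to list) that only surfaces when the result is used, which matches the review's suggestion to restore the argument rather than rely on the caller coping with lists.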

@@ -692,7 +698,7 @@ class HunYuanVLDummyInputsBuilder(BaseDummyInputsBuilder[HunYuanVLProcessingInfo
     def get_dummy_text(self, mm_counts: Mapping[str, int]) -> str:
         num_images = mm_counts.get("image", 0)

-        hf_processor = self.info.get_hf_processor()
+        hf_processor = self.info.get_hf_processor(typ=HunYuanVLProcessor)
Member:

Suggested change
-        hf_processor = self.info.get_hf_processor(typ=HunYuanVLProcessor)
+        hf_processor = self.info.get_hf_processor(HunYuanVLProcessor)

Prefer passing this as a positional arg

Member:

The test failed when a positional arg was used

Member:

That's weird but ok
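One plausible explanation for the positional form failing, sketched below under assumptions (the class name and keyword-only signature here are illustrative, not vLLM's actual `get_hf_processor` code): if the parameter is declared keyword-only, Python rejects a positional call with a `TypeError`, so only the `typ=...` form would pass the test.

```python
# Illustrative assumption: a keyword-only parameter would explain why the
# positional call fails. This is NOT the real vLLM signature.
class ProcessingInfo:
    def get_hf_processor(self, *, typ=None, **kwargs):
        # The bare "*" makes "typ" keyword-only: callers must
        # write get_hf_processor(typ=...).
        return typ


info = ProcessingInfo()

# Keyword form works:
assert info.get_hf_processor(typ=str) is str

# Positional form is rejected at call time:
try:
    info.get_hf_processor(str)
except TypeError as exc:
    print("positional call rejected:", exc)
```

If the real signature looks anything like this, the keyword form in the diff is the correct call, which would account for the test failure the reviewer observed.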

Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Signed-off-by: Raushan Turganbay <raushan.turganbay@alumni.nu.edu.kz>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@mergify bot commented Feb 6, 2026

Documentation preview: https://vllm--33977.org.readthedocs.build/en/33977/

@mergify mergify bot added the documentation Improvements or additions to documentation label Feb 6, 2026
@mergify bot commented Feb 6, 2026

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @zucchini-nlp.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Feb 6, 2026
Signed-off-by: raushan <raushan@huggingface.co>
@mergify mergify bot removed the needs-rebase label Feb 6, 2026
hmellor and others added 2 commits February 6, 2026 10:17
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: raushan <raushan@huggingface.co>
@hmellor hmellor enabled auto-merge (squash) February 6, 2026 10:26
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Feb 6, 2026
Signed-off-by: raushan <raushan@huggingface.co>
auto-merge was automatically disabled February 6, 2026 11:41

Head branch was pushed to by a user without write access

@DarkLight1337 DarkLight1337 merged commit 85ee1d9 into vllm-project:main Feb 6, 2026
53 of 54 checks passed
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request Feb 19, 2026
Signed-off-by: raushan <raushan@huggingface.co>
Signed-off-by: Raushan Turganbay <raushan.turganbay@alumni.nu.edu.kz>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Co-authored-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
tunglinwood pushed a commit to tunglinwood/vllm that referenced this pull request Mar 4, 2026
Signed-off-by: raushan <raushan@huggingface.co>
Signed-off-by: Raushan Turganbay <raushan.turganbay@alumni.nu.edu.kz>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Co-authored-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>

Labels

- bug: Something isn't working
- documentation: Improvements or additions to documentation
- multi-modality: Related to multi-modality (#4194)
- qwen: Related to Qwen models
- ready: ONLY add when PR is ready to merge/full CI is needed


3 participants