
Fix custom processors that use deleted import for Transformers v5 #35101

Merged
vllm-bot merged 2 commits into vllm-project:main from hmellor:v5-fix-hyperclovax
Feb 23, 2026

Conversation

@hmellor
Member

@hmellor hmellor commented Feb 23, 2026

Some remote code processors still import ChatTemplateLoadKwargs which was a subset of ProcessorChatTemplateKwargs as defined in Transformers v4. In Transformers v5 these were merged into ProcessorChatTemplateKwargs and ChatTemplateLoadKwargs was removed.

In vLLM CI, one architecture which does this is HCXVisionForCausalLM. If we upstream this architecture to Transformers, we can remove this backward compatibility patch in vLLM.

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

The pull request introduces a backward compatibility fix for HCXVisionForCausalLM to ensure its functionality with Transformers v5. This is achieved by aliasing the removed ChatTemplateLoadKwargs name to its v5 replacement. The changes are focused and address the compatibility issue effectively.

# Alias the name deleted in Transformers v5 so remote-code processors
# that still import it keep working.
old = getattr(processing_utils, "ChatTemplateLoadKwargs", None)
new = getattr(processing_utils, "ProcessorChatTemplateKwargs", None)
if old is None and new is not None:
    processing_utils.ChatTemplateLoadKwargs = new
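
This shim can be exercised as a self-contained sketch. The `SimpleNamespace` below is a hypothetical stand-in for `transformers.processing_utils` under v5, where only the merged name exists:

```python
from types import SimpleNamespace

# Hypothetical stand-in for transformers.processing_utils on v5:
# only the merged ProcessorChatTemplateKwargs name is defined.
processing_utils = SimpleNamespace(ProcessorChatTemplateKwargs=dict)

# The backward-compatibility shim: alias the deleted v4 name to its
# v5 replacement so remote-code processors that import it keep working.
old = getattr(processing_utils, "ChatTemplateLoadKwargs", None)
new = getattr(processing_utils, "ProcessorChatTemplateKwargs", None)
if old is None and new is not None:
    processing_utils.ChatTemplateLoadKwargs = new

# After the shim, both names resolve to the same object.
assert processing_utils.ChatTemplateLoadKwargs is processing_utils.ProcessorChatTemplateKwargs
```

The guard (`old is None and new is not None`) makes the patch a no-op on Transformers v4, where the original name still exists.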
Contributor

Severity: high
Directly modifying a third-party library's module (processing_utils.ChatTemplateLoadKwargs = new) can be fragile and lead to unexpected behavior or breakage with future updates of the transformers library. While this is noted as a temporary solution, it introduces a dependency on the internal structure of transformers that might not be stable. Consider if there's an alternative way to achieve this compatibility without directly patching the module, or ensure this patch is rigorously tested against future transformers versions.
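
As an illustration of the alternative the reviewer gestures at, the remote-code processor itself could fall back at import time rather than rely on a patch applied by vLLM. This is a hypothetical sketch; the fake module registered below stands in for Transformers v5 so the example stays self-contained:

```python
import sys
import types

# Fake module standing in for transformers.processing_utils on v5,
# where ChatTemplateLoadKwargs has been removed.
fake = types.ModuleType("fake_processing_utils")
fake.ProcessorChatTemplateKwargs = dict
sys.modules["fake_processing_utils"] = fake

try:
    # Transformers v4 spelling.
    from fake_processing_utils import ChatTemplateLoadKwargs
except ImportError:
    # Transformers v5: the load kwargs were merged into
    # ProcessorChatTemplateKwargs, so import it under the old name.
    from fake_processing_utils import (
        ProcessorChatTemplateKwargs as ChatTemplateLoadKwargs,
    )

# Either branch leaves ChatTemplateLoadKwargs usable by the processor.
assert ChatTemplateLoadKwargs is dict
```

This only helps if the remote code is updated, which is exactly why the PR patches on the vLLM side instead: existing checkpoints with unchanged remote code must keep working.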

Member Author

This is unfortunately necessary

@hmellor hmellor changed the title Fix HCXVisionForCausalLM for Transformers v5 Fix custom processors that use deleted import for Transformers v5 Feb 23, 2026
@DarkLight1337 DarkLight1337 enabled auto-merge (squash) February 23, 2026 14:25
@hmellor hmellor added the ready ONLY add when PR is ready to merge/full CI is needed label Feb 23, 2026
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@vllm-bot vllm-bot merged commit 864167d into vllm-project:main Feb 23, 2026
45 of 47 checks passed
@hmellor hmellor deleted the v5-fix-hyperclovax branch February 23, 2026 16:44
llsj14 pushed a commit to llsj14/vllm that referenced this pull request Mar 1, 2026
…lm-project#35101)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
tunglinwood pushed a commit to tunglinwood/vllm that referenced this pull request Mar 4, 2026
…lm-project#35101)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>

Labels

ready ONLY add when PR is ready to merge/full CI is needed


3 participants