Fix custom processors that use deleted import for Transformers v5 #35101
vllm-bot merged 2 commits into vllm-project:main
Conversation
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Code Review
The pull request introduces a backward compatibility fix so that HCXVisionForCausalLM keeps working with Transformers v5. This is achieved by aliasing a removed keyword-argument class. The changes are focused and address the compatibility issue effectively.
vllm/transformers_utils/processor.py
Outdated

```python
old = getattr(processing_utils, "ChatTemplateLoadKwargs", None)
new = getattr(processing_utils, "ProcessorChatTemplateKwargs", None)
if old is None and new is not None:
    processing_utils.ChatTemplateLoadKwargs = new
```
Directly modifying a third-party library's module (`processing_utils.ChatTemplateLoadKwargs = new`) can be fragile and lead to unexpected behavior or breakage with future updates of the transformers library. While this is noted as a temporary solution, it introduces a dependency on the internal structure of transformers that might not be stable. Consider whether there is an alternative way to achieve this compatibility without directly patching the module, or ensure this patch is rigorously tested against future transformers versions.
This is unfortunately necessary
HCXVisionForCausalLM for Transformers v5
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
…lm-project#35101) Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Some remote code processors still import `ChatTemplateLoadKwargs`, which was a subset of `ProcessorChatTemplateKwargs` as defined in Transformers v4. In Transformers v5 these were merged into `ProcessorChatTemplateKwargs` and `ChatTemplateLoadKwargs` was removed.

In vLLM CI, one architecture which does this is `HCXVisionForCausalLM`. If we upstream this architecture to Transformers, we can remove this backward compatibility patch in vLLM.
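The aliasing technique discussed above can be sketched in isolation. This is a minimal, self-contained illustration of the shim pattern, not the actual vLLM code: `legacy_mod`, `OldKwargs`, and `NewKwargs` are hypothetical stand-ins for `processing_utils`, `ChatTemplateLoadKwargs`, and `ProcessorChatTemplateKwargs`.

```python
import types

# Simulate a "v5" module where the old attribute name was removed and
# only the merged class remains.
legacy_mod = types.ModuleType("legacy_mod")


class NewKwargs(dict):
    """Stands in for the merged kwargs class in the new version."""


legacy_mod.NewKwargs = NewKwargs

# The shim: if the old name is gone but the new one exists, alias the
# old name to the new class so remote code importing the old name
# keeps working. On older versions (old name still present) this is a
# no-op, so the patch is safe to apply unconditionally.
old = getattr(legacy_mod, "OldKwargs", None)
new = getattr(legacy_mod, "NewKwargs", None)
if old is None and new is not None:
    legacy_mod.OldKwargs = new

# Remote code can now resolve the old attribute again.
print(legacy_mod.OldKwargs is legacy_mod.NewKwargs)  # True
```

As the reviewer notes, this monkey-patches library internals, so it should be revisited whenever the pinned transformers version changes.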