[Bugfix]: Fix TokenizerLike interface#30009
Merged
vllm-bot merged 8 commits into vllm-project:main on Dec 6, 2025
Conversation
Signed-off-by: Rohan138 <rohanpotdar138@gmail.com>
Contributor
Code Review
This pull request fixes a crash that occurs when using MistralTokenizer in benchmarks by adding the missing num_special_tokens_to_add method. My review focuses on making the implementation of this new method more robust and maintainable. I've suggested deriving the return value dynamically rather than hardcoding it, which will prevent silent bugs if the tokenizer's encoding behavior changes in the future.
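The dynamic derivation the reviewer suggests could look roughly like this. This is a sketch against a generic tokenizer interface, not vLLM's actual implementation; `ToyTokenizer` and its `BOS_ID` are hypothetical stand-ins used only to make the snippet self-contained:

```python
class ToyTokenizer:
    """Hypothetical tokenizer used only to illustrate the idea."""
    BOS_ID = 1  # assumed BOS token id, for illustration only

    def encode(self, text, add_special_tokens=True):
        ids = [ord(c) for c in text]  # toy per-character "tokenization"
        return [self.BOS_ID] + ids if add_special_tokens else ids


def num_special_tokens_to_add(tokenizer) -> int:
    # Encode an empty string: every id returned must be a special
    # token, so the count tracks the tokenizer's real encoding
    # behavior instead of a hardcoded constant.
    return len(tokenizer.encode("", add_special_tokens=True))


print(num_special_tokens_to_add(ToyTokenizer()))  # 1 (just the BOS token)
```

Deriving the count this way means a future change to the tokenizer's special-token handling is picked up automatically rather than silently diverging from a hardcoded value.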
Rohan138 commented on Dec 4, 2025
Changed the title: num_special_tokens_to_add to MistralTokenizer → num_special_tokens_to_add to MistralTokenizer, update PretrainedTokenizerBase -> TokenizerLike
Changed the title: num_special_tokens_to_add to MistralTokenizer, update PretrainedTokenizerBase -> TokenizerLike → Fix TokenizerLike interface
Member
Please fix pre-commit

Hi @Rohan138, the pre-commit checks have failed. Please run:

uv pip install pre-commit
pre-commit install
pre-commit run --all-files

Then, commit the changes and push to your branch. After pre-commit install, the hooks will also run automatically on future commits.
Signed-off-by: Rohan138 <rohanpotdar138@gmail.com>
Force-pushed from 9e6e449 to 6f67bc5 (Compare)
DarkLight1337 approved these changes on Dec 6, 2025
dsuhinin pushed a commit to dsuhinin/vllm that referenced this pull request on Jan 21, 2026
Signed-off-by: Rohan138 <rohanpotdar138@gmail.com> Signed-off-by: dsuhinin <suhinin.dmitriy@gmail.com>
Purpose
Corrected and rebased version of #22121 to fix #22013. Note that this function is only called from RandomDataset.sample, which calls tokenizer.encode(prompt, add_special_tokens=False) (https://github.com/vllm-project/vllm/blob/main/vllm/benchmarks/datasets.py#L407). Hence, the tokenizer only adds the bos token, not eos, <INST>, or </INST>.

Changes:
- Added num_special_tokens_to_add to MistralTokenizer

Test Plan
Currently, with vllm/vllm-openai:nightly:

Test Result

With this PR:
Minimal example of the bos token being added to an empty prompt:
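The original reproduction snippet did not survive extraction. A stand-alone sketch of the same behavior, where `MiniTokenizer` is a hypothetical stand-in for the real MistralTokenizer (the actual reproduction requires mistral_common):

```python
# Hypothetical stand-in for MistralTokenizer, illustrating the behavior
# described above: encoding an empty prompt with special tokens enabled
# yields exactly one id (the BOS token), while add_special_tokens=False
# yields nothing.
class MiniTokenizer:
    BOS_ID = 1  # assumed id, for illustration only

    def encode(self, prompt, add_special_tokens=True):
        ids = [ord(c) for c in prompt]
        return [self.BOS_ID] + ids if add_special_tokens else ids


tok = MiniTokenizer()
print(tok.encode(""))                            # [1] -> just the BOS token
print(tok.encode("", add_special_tokens=False))  # []  -> no special tokens
```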
Note that before #29693, vLLM would incorrectly default to the HF tokenizer even if mistral_common was installed, which hid the issue unless you explicitly specified --tokenizer-mode mistral during benchmarking. However, on recent nightly builds this breaks, since the tokenizer correctly defaults to the MistralTokenizer backend when using --tokenizer-mode auto.