[Model] Add Support for Grok2 #24286
Closed
wenchen76 wants to merge 2 commits into vllm-project:main from
Conversation
Contributor
This pull request has merge conflicts that must be resolved before it can be merged.
vermavis pushed a commit to vermavis/vllm that referenced this pull request on Sep 19, 2025
Contributor
Documentation preview: https://vllm--24286.org.readthedocs.build/en/24286/
Contributor
Hey there, I'm curious about the status of this PR and whether this description is still accurate?
Member
Closing this PR as Grok-2 support has been fully implemented and merged via #31847.
Purpose
To address #23557
Test Plan
Test Result
This is a draft PR since the work is still in progress and the implementation currently produces incorrect results.
Tokenizer support:
tokenizer.tok.json is currently not supported. As a workaround, you can use the Hugging Face-compatible tokenizer available here:
FlashAttention issue:
I encountered the following error when testing:
To work around this, I used flashinfer. I'm not sure whether this error is specific to my environment.
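For concreteness, vLLM lets you select the attention backend through the `VLLM_ATTENTION_BACKEND` environment variable, so the flashinfer workaround can be sketched as below. The model path is a placeholder, not taken from this PR, and the engine construction is left commented out since it needs the actual weights:

```python
import os

# Select the FlashInfer attention backend before the engine is built;
# vLLM reads VLLM_ATTENTION_BACKEND at engine-initialization time.
os.environ["VLLM_ATTENTION_BACKEND"] = "FLASHINFER"

# Illustrative only: the path below is a placeholder, so the engine
# construction is commented out.
# from vllm import LLM
# llm = LLM(model="/path/to/grok-2", trust_remote_code=True)

print(os.environ["VLLM_ATTENTION_BACKEND"])
```

The same effect can be had by exporting the variable in the shell before launching `vllm serve`.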
The generated responses are still incorrect; I haven't had the chance to fully debug this yet.
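Returning to the tokenizer workaround above: vLLM accepts a tokenizer path separate from the model weights, which is one way to substitute a Hugging Face-compatible tokenizer for the unsupported tokenizer.tok.json. The sketch below is hypothetical; both paths are placeholders and not taken from this PR:

```python
# Hedged sketch of the tokenizer workaround: point vLLM at an external
# Hugging Face-compatible tokenizer instead of tokenizer.tok.json.
# Both paths are placeholders.
model_path = "/path/to/grok-2"
tokenizer_path = "/path/to/hf-compatible-tokenizer"

# vLLM's LLM constructor takes `tokenizer` independently of `model`;
# construction is commented out since the placeholder paths don't exist.
# from vllm import LLM
# llm = LLM(model=model_path, tokenizer=tokenizer_path,
#           trust_remote_code=True)

print(model_path, tokenizer_path)
```

The equivalent for the server is the `--tokenizer` flag on `vllm serve`.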
@igor-susic1 @BranZhai @Crucifixion-Fxl — It would be great if you could take a look and help out whenever you get the chance. Thanks so much! 🙏
Essential Elements of an Effective PR Description Checklist
supported_models.md and examples for a new model.