[BugFix] Async scheduling: handle model forward errors more cleanly#31611
Merged
vllm-bot merged 1 commit into vllm-project:main on Jan 4, 2026
Conversation
If the model runner's `execute_model()` method raises an exception, it is logged via a future callback, but the core loop subsequently fails with a misleading secondary exception because `sample_tokens()` returns `None`:

`AttributeError: 'NoneType' object has no attribute 'sampled_token_ids'`

This PR changes the `step_with_batch_queue()` method to instead raise the root-cause exception from the `execute_model()` future inline in this case, which also removes the need for the error callback.

Signed-off-by: njhill <nickhill123@gmail.com>
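The fix follows a standard `concurrent.futures` pattern: rather than registering an error callback and letting the loop continue with a `None` result, call the future's `result()` inline so its stored exception re-raises at the call site. A minimal sketch, using a hypothetical `execute_model` stand-in rather than vLLM's actual model runner API:

```python
from concurrent.futures import ThreadPoolExecutor

def execute_model(batch):
    # Hypothetical stand-in for the model runner's forward call,
    # which fails here to demonstrate error propagation.
    raise ValueError("forward pass failed")

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(execute_model, batch={})

# Old pattern: an error callback logs the exception, but the core loop
# keeps going and later trips over a misleading secondary error
# (e.g. an AttributeError on a None result):
#   future.add_done_callback(lambda f: logger.error(f.exception()))

# New pattern: calling .result() inline re-raises the original
# exception directly in the core loop, surfacing the root cause.
try:
    future.result()
    root_cause = None
except ValueError as e:
    root_cause = str(e)

print(root_cause)  # the original error, not a secondary AttributeError
executor.shutdown()
```

The key property is that `Future.result()` re-raises whatever exception the worker raised, so the caller sees the real failure instead of a downstream symptom.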
Contributor
Code Review
This pull request refactors the error handling in asynchronous scheduling to propagate the root-cause exception from `execute_model()` failures, which is a good improvement. However, the implementation introduces a critical bug where a successful execution path can lead to a `RuntimeError`. My review includes a comment with a suggested fix for this issue.
ohsono added a commit to ohsono/vllm that referenced this pull request on Jan 2, 2026

Signed-off-by: Hochan Son <ohsono@gmail.com>
Member, Author

CI failure is unrelated
mgoin approved these changes on Jan 4, 2026
LucasWilkinson pushed a commit to neuralmagic/vllm that referenced this pull request on Jan 6, 2026

…llm-project#31611) Signed-off-by: njhill <nickhill123@gmail.com>
yugong333 pushed a commit to yugong333/vllm that referenced this pull request on Jan 9, 2026

…llm-project#31611) Signed-off-by: njhill <nickhill123@gmail.com>
akh64bit pushed a commit to akh64bit/vllm that referenced this pull request on Jan 16, 2026

…llm-project#31611) Signed-off-by: njhill <nickhill123@gmail.com>
dsuhinin pushed a commit to dsuhinin/vllm that referenced this pull request on Jan 21, 2026

…llm-project#31611) Signed-off-by: njhill <nickhill123@gmail.com> Signed-off-by: dsuhinin <suhinin.dmitriy@gmail.com>
ItzDEXX pushed a commit to ItzDEXX/vllm that referenced this pull request on Feb 19, 2026

…llm-project#31611) Signed-off-by: njhill <nickhill123@gmail.com>
ohsono added a commit to ohsono/vllm that referenced this pull request on Feb 25, 2026

Signed-off-by: Hochan Son <ohsono@gmail.com>