
[bugfix] Fix critical bug when reporting for all paths where handler.create_error_response is used#34516

Merged
vllm-bot merged 2 commits into vllm-project:main from kizill:fix_request_error_reporting
Feb 15, 2026

Conversation

@kizill kizill (Contributor) commented Feb 13, 2026

Purpose

Current versions (including v0.15.1 and v0.16.0) return HTTP status code 200 when an error occurs. That's critical for many applications.

Test Plan

Test Result

…(e) presents

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
@gemini-code-assist gemini-code-assist bot (Contributor) left a comment

Code Review

This pull request addresses a critical bug where API endpoints would return a 200 OK status code even when an exception occurred. The fix is consistently applied across all affected API routers by modifying the exception handling logic. Instead of returning an ErrorResponse object directly from the except block, the changes now assign it to a local variable. This allows the existing subsequent logic to correctly construct a JSONResponse with the appropriate error status code. The changes are correct and effectively resolve the issue.
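The pattern the review describes can be sketched as follows. This is a minimal, self-contained illustration of the bug and the fix; all names here (`handle`, `create_error_response`, the dataclasses) are hypothetical stand-ins, not vLLM's actual code.

```python
from dataclasses import dataclass


@dataclass
class ErrorResponse:
    """Plain error model, as returned by create_error_response."""
    message: str
    code: int = 500


@dataclass
class JSONResponse:
    """Stand-in for a framework response that carries a status code."""
    content: dict
    status_code: int = 200


def handle(request: dict) -> dict:
    # Hypothetical handler that may raise.
    if "prompt" not in request:
        raise ValueError("missing prompt")
    return {"ok": True}


def create_error_response(message: str) -> ErrorResponse:
    return ErrorResponse(message=message)


def endpoint(request: dict) -> JSONResponse:
    try:
        result = handle(request)
    except Exception as e:
        # Buggy version: `return create_error_response(str(e))` here,
        # which the framework serialized with a default 200 status.
        # Fixed version: assign to a local and fall through to the
        # shared wrapping logic below.
        result = create_error_response(str(e))
    if isinstance(result, ErrorResponse):
        # Existing logic that attaches the correct HTTP status code.
        return JSONResponse(content={"error": result.message},
                            status_code=result.code)
    return JSONResponse(content=result)
```

With this shape, a failing request flows through the same wrapping branch as any other `ErrorResponse`, so the client sees a 500 instead of a misleading 200.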

@DarkLight1337 DarkLight1337 (Member) left a comment

I think it would be better to update create_error_response to create a JSONResponse with the status code instead.
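The suggested alternative would look roughly like this sketch, where `create_error_response` itself returns a status-carrying response object rather than a bare error model. Names are illustrative, not vLLM's actual API; as the author's reply below explains, this shape was rejected because many call sites expect the error model itself.

```python
from dataclasses import dataclass


@dataclass
class JSONResponse:
    """Stand-in for a framework response that carries a status code."""
    content: dict
    status_code: int = 200


def create_error_response(message: str, status_code: int = 500) -> JSONResponse:
    # The status code always travels with the error, so routers could
    # return this value directly from an except block.
    return JSONResponse(content={"error": message}, status_code=status_code)
```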

@kizill kizill (Contributor, Author) commented Feb 13, 2026

@DarkLight1337 I tried that and found it would break the error-handling logic in many places, since create_error_response is called not only when an unhandled exception occurs.

@DarkLight1337 DarkLight1337 (Member) commented

Oh, I didn't know that. @hmellor, are you OK with this?

@DarkLight1337 DarkLight1337 (Member) left a comment

Let's get this fix in (sorry I meant to ping @noooop )

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) February 14, 2026 15:12
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Feb 14, 2026
@DarkLight1337 DarkLight1337 added this to the v0.16.0 cherry picks milestone Feb 14, 2026
@vllm-bot vllm-bot merged commit 50dbd6c into vllm-project:main Feb 15, 2026
47 of 49 checks passed
athrael-soju pushed a commit to athrael-soju/vllm that referenced this pull request Feb 16, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Signed-off-by: athrael-soju <athrael-soju@users.noreply.github.com>
wzhao18 pushed a commit to wzhao18/vllm that referenced this pull request Feb 18, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Signed-off-by: wzhao18 <wzhao18.sz@gmail.com>
eldarkurtic pushed a commit to eldarkurtic/vllm that referenced this pull request Feb 19, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Signed-off-by: Eldar Kurtic <research@neuralmagic.com>
ZJY0516 pushed a commit to ZJY0516/vllm that referenced this pull request Feb 23, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Signed-off-by: zjy0516 <riverclouds.zhu@qq.com>
@ushaket ushaket commented Feb 25, 2026

Is there an issue for this PR? I'd like to reopen it if one exists.
I believe you missed vllm.entrypoints.openai.speech_to_text.speech_to_text in this PR.

@DarkLight1337 DarkLight1337 (Member) commented

#31164 should solve this in a more comprehensive manner, can you try it out?

@ushaket ushaket commented Feb 26, 2026

I'm afraid I've moved on to other work. I ran into this issue by accident and wanted to raise it; if I hit it again, I'll report back.

llsj14 pushed a commit to llsj14/vllm that referenced this pull request Mar 1, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
tunglinwood pushed a commit to tunglinwood/vllm that referenced this pull request Mar 4, 2026
…create_error_response is used (vllm-project#34516)

Signed-off-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Stanislav Kirillov <stas@nebius.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>

Labels

bug — Something isn't working
frontend
ready — ONLY add when PR is ready to merge/full CI is needed


4 participants