
[Bugfix] Fix ResponseCreatedEvent ValidationError for json_schema format in streaming #34611

Open
agangadi24 wants to merge 1 commit into vllm-project:main from agangadi24:fix/response-created-event-json-schema-validation

Conversation

@agangadi24

Summary

  • Remove .model_dump() from ResponsesResponse before passing to ResponseCreatedEvent/ResponseInProgressEvent during streaming
  • .model_dump() serialized using Python attribute names (e.g. schema_) but Pydantic re-validation expected JSON alias names (e.g. schema), causing a ValidationError when text.format.type: json_schema was used with stream=True
  • This makes event construction consistent with ResponseCompletedEvent, which already passes the ResponsesResponse object directly
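The alias mismatch can be reproduced with a minimal Pydantic sketch. The model names below are illustrative stand-ins for the OpenAI SDK types, not the actual vLLM classes:

```python
from pydantic import BaseModel, Field, ValidationError


class JsonSchemaFormat(BaseModel):
    """Stand-in for the json_schema text format: the attribute is schema_
    (avoiding a clash with BaseModel.schema) with JSON alias "schema"."""
    type: str = "json_schema"
    schema_: dict = Field(alias="schema")


class CreatedEvent(BaseModel):
    """Stand-in for ResponseCreatedEvent, which validates its payload."""
    response: JsonSchemaFormat


fmt = JsonSchemaFormat(**{"schema": {"type": "object"}})

# Passing the model object directly validates cleanly (the fix):
ok = CreatedEvent(response=fmt)

# Passing .model_dump() fails: the dumped key is the attribute name
# "schema_", but validation expects the alias "schema".
try:
    CreatedEvent(response=fmt.model_dump())
    raised = False
except ValidationError:
    raised = True
```

Under Pydantic v2 defaults, `model_dump()` emits field names rather than aliases, and validation accepts only the alias unless `populate_by_name=True` is set, which is why the dict round-trip fails while the object passes through.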

Test plan

  • Existing structured output tests pass: pytest tests/v1/entrypoints/openai/serving_responses/test_structured_output.py -v
  • Existing streaming tests pass: pytest tests/v1/entrypoints/openai/serving_responses/test_basic.py -v
  • New test_structured_output_streaming covers the exact failing code path (json_schema + streaming)
  • Unit tests pass: pytest tests/entrypoints/openai/test_serving_responses.py -v

@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly.

You can ask your reviewers to trigger select CI tests on top of fastcheck CI.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

If you have any questions, please reach out to us on Slack at https://slack.vllm.ai.

🚀

Contributor

@gemini-code-assist bot left a comment


Code Review

This pull request correctly fixes a ValidationError that occurred during streaming with json_schema format. The fix, which removes the .model_dump() call when creating ResponseCreatedEvent and ResponseInProgressEvent, is appropriate and aligns the behavior with ResponseCompletedEvent. The new regression test, test_structured_output_streaming, effectively covers the specific failing scenario. The changes are well-executed and address the issue properly.

@agangadi24 force-pushed the fix/response-created-event-json-schema-validation branch from e185ce0 to dab8ee7 on February 16, 2026 09:13
@mergify

mergify bot commented Feb 16, 2026

Hi @agangadi24, the pre-commit checks have failed. Please run:

uv pip install pre-commit
pre-commit install
pre-commit run --all-files

Then, commit the changes and push to your branch.

For future commits, pre-commit will run automatically on changed files before each commit.

Tip

Is mypy or markdownlint failing?
mypy and markdownlint are run differently in CI. If the failure is related to either of these checks, please use the following commands to run them locally:
# For mypy (substitute "3.10" with the failing version if needed)
pre-commit run --hook-stage manual mypy-3.10
# For markdownlint
pre-commit run --hook-stage manual markdownlint

@agangadi24 force-pushed the fix/response-created-event-json-schema-validation branch 2 times, most recently from e9b2ccf to 24a6a7d on February 16, 2026 17:58
…mat in streaming

Remove `.model_dump()` from `ResponsesResponse` before passing to
`ResponseCreatedEvent` and `ResponseInProgressEvent`. The `.model_dump()`
serialized using Python attribute names (e.g. `schema_`) but Pydantic
re-validation expected JSON alias names (e.g. `schema`), causing a
ValidationError when `text.format.type: json_schema` was used with streaming.

This makes `ResponseCreatedEvent`/`ResponseInProgressEvent` construction
consistent with `ResponseCompletedEvent`, which already passes the object
directly.

Signed-off-by: Anusha Gangadi <agangadi@nvidia.com>
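As an aside, `model_dump(by_alias=True)` would also have produced a re-validatable dict, but passing the model object directly avoids the serialize/re-validate round-trip entirely and matches what `ResponseCompletedEvent` already does. A quick illustration of the two serializations, using a hypothetical stand-in model:

```python
from pydantic import BaseModel, Field


class JsonSchemaFormat(BaseModel):
    """Hypothetical stand-in with an aliased field, mirroring the SDK type."""
    type: str = "json_schema"
    schema_: dict = Field(alias="schema")


fmt = JsonSchemaFormat(**{"schema": {"minimum": 0}})

# Default dump uses the Python attribute name:
print(fmt.model_dump())               # {'type': 'json_schema', 'schema_': {'minimum': 0}}
# by_alias=True restores the JSON alias that re-validation expects:
print(fmt.model_dump(by_alias=True))  # {'type': 'json_schema', 'schema': {'minimum': 0}}
```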

Labels

bug (Something isn't working), frontend, v1

Projects

None yet


1 participant