
[Bugfix] accept redacted thinking blocks in Anthropic messages#36992

Merged
DarkLight1337 merged 4 commits into vllm-project:main from bbartels:benjamin/fix-anthropic-redacted-thinking on Mar 16, 2026

Conversation

@bbartels
Contributor

@bbartels bbartels commented Mar 13, 2026

Purpose

  • Accept Anthropic redacted_thinking content blocks in echoed assistant messages so follow-up /v1/messages requests do not fail validation.
  • When a previous Claude session contains redacted thinking blocks (produced when targeting Anthropic models) and is replayed against vLLM, validation fails.
  • Ignore opaque redacted reasoning during Anthropic-to-OpenAI request conversion while preserving normal thinking -> reasoning handling.
  • Add regression coverage for assistant thinking-block conversion, including multi-turn replay and redacted_thinking inputs.
  • AI assistance: prepared with OpenCode (GPT-5.4).
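The conversion behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual vLLM code: the function name `convert_assistant_content` and the exact block/field names are assumptions made for the sketch.

```python
# Hypothetical sketch of the Anthropic-to-OpenAI content conversion described
# in this PR; the real vLLM code paths and names differ.
def convert_assistant_content(blocks: list[dict]) -> dict:
    """Convert Anthropic assistant content blocks to an OpenAI-style message."""
    text_parts: list[str] = []
    reasoning_parts: list[str] = []
    for block in blocks:
        btype = block.get("type")
        if btype == "text":
            text_parts.append(block["text"])
        elif btype == "thinking":
            # Normal thinking blocks carry readable reasoning text and are
            # preserved as reasoning content.
            reasoning_parts.append(block["thinking"])
        elif btype == "redacted_thinking":
            # Opaque, encrypted payload: accept it during validation but drop
            # it from the converted prompt instead of raising an error.
            continue
        else:
            raise ValueError(f"unsupported content block type: {btype!r}")
    message = {"role": "assistant", "content": "".join(text_parts)}
    if reasoning_parts:
        message["reasoning_content"] = "".join(reasoning_parts)
    return message
```

The key point is the `redacted_thinking` branch: before this fix, an unrecognized block type would make the request fail validation; afterwards, such blocks are silently ignored.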

Test Plan

  • pytest tests/entrypoints/openai/test_anthropic_messages_conversion.py -q

Test Result

  • Not run in this environment because pytest is not installed (/bin/bash: line 1: pytest: command not found).

Allow Anthropic clients to replay assistant turns that include redacted thinking blocks without failing request validation, and keep opaque redacted reasoning out of the converted OpenAI prompt.

Generated-by: OpenCode (GPT-5.4)

Signed-off-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces changes to handle redacted_thinking content blocks from Anthropic. The AnthropicContentBlock protocol is updated to accept redacted_thinking as a valid type, and the corresponding request conversion logic is modified to ignore these blocks, preventing validation errors. New regression tests are added to verify the handling of thinking blocks, including redacted_thinking, in assistant messages during multi-turn conversations. The changes are localized to the Anthropic protocol and serving logic, with corresponding test coverage.
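The protocol-level change the review describes (accepting `redacted_thinking` as a valid block type) can be illustrated with a minimal sketch. The real `AnthropicContentBlock` is a pydantic model in vLLM; the type alias and helper below are assumptions made for illustration only.

```python
from typing import Literal, get_args

# Hypothetical stand-in for the allowed content-block types; the actual
# AnthropicContentBlock protocol in vLLM is a pydantic model with more fields.
AnthropicBlockType = Literal[
    "text",
    "thinking",
    "redacted_thinking",  # newly accepted by this PR
    "tool_use",
    "tool_result",
]

def is_valid_block_type(block_type: str) -> bool:
    """Return True if the block type is accepted by the protocol."""
    return block_type in get_args(AnthropicBlockType)
```

With `redacted_thinking` in the accepted set, echoed assistant messages containing such blocks pass validation instead of being rejected.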

Member

@DarkLight1337 DarkLight1337 left a comment


Please fix the merge conflicts though

@mergify

mergify bot commented Mar 14, 2026

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @bbartels.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify mergify bot added the needs-rebase label Mar 14, 2026
@mergify mergify bot removed the needs-rebase label Mar 14, 2026
@bbartels
Contributor Author

@DarkLight1337 Fixed merge conflicts!

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) March 14, 2026 14:22
@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Mar 14, 2026
auto-merge was automatically disabled March 15, 2026 16:56

Head branch was pushed to by a user without write access

Signed-off-by: bbartels <benjamin@bartels.dev>
@bbartels bbartels force-pushed the benjamin/fix-anthropic-redacted-thinking branch from 7c12374 to b45f105 Compare March 15, 2026 16:57
@bbartels
Contributor Author

@DarkLight1337 fixed tests, all passing now

@DarkLight1337 DarkLight1337 merged commit 0e5a938 into vllm-project:main Mar 16, 2026
46 checks passed
Lucaskabela pushed a commit to Lucaskabela/vllm that referenced this pull request Mar 17, 2026
…project#36992)

Signed-off-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
Signed-off-by: bbartels <benjamin@bartels.dev>
Co-authored-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
wendyliu235 pushed a commit to wendyliu235/vllm-public that referenced this pull request Mar 18, 2026
…project#36992)

Signed-off-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
Signed-off-by: bbartels <benjamin@bartels.dev>
Co-authored-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
fxdawnn pushed a commit to fxdawnn/vllm that referenced this pull request Mar 19, 2026
…project#36992)

Signed-off-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>
Signed-off-by: bbartels <benjamin@bartels.dev>
Co-authored-by: Benjamin Bartels <benjaminba@tiglab-ubuntu.ilab.local>

Labels

bug (Something isn't working) · frontend · ready (ONLY add when PR is ready to merge/full CI is needed)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants