
Conversation

@luis5tb
Contributor

@luis5tb luis5tb commented Sep 24, 2025

Ensure harmony utils can also handle arrays for tool message content

This fixes the issue reported at #25001
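
For context, the kind of request that previously failed is a tool message whose content is a list of text parts rather than a plain string. A minimal, hypothetical example (the exact payload in #25001 may differ):

```python
# Hypothetical tool message whose "content" is a list of text parts
# rather than a plain string; requests shaped like this were rejected
# by the harmony path before this change. The tool_call_id is a placeholder.
tool_message = {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": [
        {"type": "text", "text": '{"temperature": 21, "unit": "C"}'},
    ],
}
```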

@github-actions

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which covers a small, essential subset of CI tests to quickly catch errors.

You can ask your reviewers to trigger select CI tests on top of fastcheck CI.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

If you have any questions, please reach out to us on Slack at https://slack.vllm.ai.

🚀

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request aims to add support for array content in tool messages. The current implementation is partial, only handling the first element of a text array, which can lead to data loss and potential runtime errors for other array types. I've suggested a more robust implementation to correctly process all text parts in an array and ensure the content is always a string, preventing crashes and preserving all information.
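
A minimal sketch of the normalization described above, assuming OpenAI-style content parts; the helper name is hypothetical and this is not the actual vLLM code:

```python
# Sketch of the suggested normalization (hypothetical helper, not the
# actual vLLM implementation): always return a string, and keep every
# text part instead of only the first one.
def tool_content_to_str(content) -> str:
    if content is None:
        return ""
    if isinstance(content, str):
        return content
    # Otherwise assume a list of content parts, either bare strings or
    # dicts such as {"type": "text", "text": "..."}.
    parts = []
    for part in content:
        if isinstance(part, str):
            parts.append(part)
        elif isinstance(part, dict) and part.get("type") == "text":
            parts.append(part.get("text", ""))
    return "".join(parts)
```

Whether the parts are joined with an empty string, a space, or a newline is an implementation detail; the point is that nothing is dropped and the result is always a string.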

@bbrowning
Contributor

This directionally seems reasonable and looks like it would fix the issue of vLLM rejecting valid tool responses when their content is a list of content parts rather than a direct string. The Chat Completions API spec does not technically restrict the array of text content parts in the content of the tool role to only one, so the Gemini code suggestion to combine all the content parts instead of taking only the first may be worth considering.

For the failing ruff pre-commit hook, see https://docs.vllm.ai/en/latest/contributing/#linting for how to run these linting checks locally.

@luis5tb luis5tb force-pushed the tool-message-array branch 2 times, most recently from 83482f0 to da5e8eb, on September 25, 2025 05:39
Ensure harmony utils can also handle arrays for tool message content

Signed-off-by: Luis Tomas Bolivar <[email protected]>
@bbrowning
Contributor

The latest version of this looks good to me, with tests added for tool message content as a string, as a one-element list, and as a multi-element list. Thanks for helping find and fix these tool calling gaps!
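
For reference, a rough sketch of those three cases as a parametrized test, reusing the hypothetical tool_content_to_str helper sketched earlier (the real tests live in the vLLM test suite and exercise the harmony utils directly):

```python
import pytest

# Rough sketch of the three cases mentioned above: plain string,
# one-element list, multi-element list. Uses the hypothetical
# tool_content_to_str helper from the earlier sketch.
@pytest.mark.parametrize(
    "content,expected",
    [
        ("plain string", "plain string"),
        ([{"type": "text", "text": "only part"}], "only part"),
        (
            [
                {"type": "text", "text": "part one "},
                {"type": "text", "text": "part two"},
            ],
            "part one part two",
        ),
    ],
)
def test_tool_message_content(content, expected):
    assert tool_content_to_str(content) == expected
```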

@luis5tb luis5tb requested a review from chaunceyjiang October 1, 2025 10:11

@tisnik tisnik left a comment


it looks perfectly sane, thank you

Collaborator

@NickLucche NickLucche left a comment


lgtm

@github-project-automation github-project-automation bot moved this from To Triage to Ready in gpt-oss Issues & Enhancements Oct 10, 2025
@NickLucche NickLucche enabled auto-merge (squash) October 10, 2025 07:34
@github-actions github-actions bot added the ready label (ONLY add when PR is ready to merge/full CI is needed) Oct 10, 2025
@NickLucche NickLucche merged commit 3ee202e into vllm-project:main Oct 10, 2025
53 checks passed
@chaunceyjiang chaunceyjiang mentioned this pull request Oct 10, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
Dhruvilbhatt pushed a commit to Dhruvilbhatt/vllm that referenced this pull request Oct 14, 2025
bbartels pushed a commit to bbartels/vllm that referenced this pull request Oct 16, 2025
lywa1998 pushed a commit to lywa1998/vllm that referenced this pull request Oct 20, 2025
alhridoy pushed a commit to alhridoy/vllm that referenced this pull request Oct 24, 2025
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 24, 2025
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025
0xrushi pushed a commit to 0xrushi/vllm that referenced this pull request Oct 26, 2025
rtourgeman pushed a commit to rtourgeman/vllm that referenced this pull request Nov 10, 2025
devpatelio pushed a commit to SumanthRH/vllm that referenced this pull request Nov 29, 2025

Labels

frontend
gpt-oss (Related to GPT-OSS models)
ready (ONLY add when PR is ready to merge/full CI is needed)

Projects

Status: Done


5 participants