
Conversation


@dawidkulpa dawidkulpa commented Oct 12, 2025

Title

[Fix] Error from OpenAI Responses API with web_search

Relevant issues

When the OpenAI Responses API is called with the web_search tool and the streamed response contains the resulting web_search calls, LiteLLM throws an error with a 503 status:

litellm.ServiceUnavailableError: litellm.MidStreamFallbackError: litellm.APIConnectionError: APIConnectionError: OpenAIException - Error parsing chunk: Chat provider: Invalid output_item  {'id': 'ws_0e5386c91dd50f830068ebe883849081919a0164a2c5f5a7f5', 'type': 'web_search_call', 'status': 'in_progress'},
Received chunk: type=<ResponsesAPIStreamEvents.OUTPUT_ITEM_ADDED: 'response.output_item.added'> output_index=1 item={'id': 'ws_0e5386c91dd50f830068ebe883849081919a0164a2c5f5a7f5', 'type': 'web_search_call', 'status': 'in_progress'} sequence_number=563.
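
A rough reproduction sketch; the model name, tool spec, and exact parameters below are assumptions rather than details taken from this PR:

```python
# Rough repro sketch: stream a Responses API call that uses the web_search tool
# through LiteLLM. Model name and tool spec are assumptions, not from the PR.
import litellm

stream = litellm.responses(
    model="openai/gpt-4.1",
    input="What is in the news today?",
    tools=[{"type": "web_search"}],
    stream=True,
)

for event in stream:
    # Per the log above, a `response.output_item.added` event carrying a
    # `web_search_call` item is what the Responses -> Completions
    # transformation failed to parse, surfacing as a 503 ServiceUnavailableError.
    print(event)
```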

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have Added testing in the tests/litellm/ directory, Adding at least 1 test is a hard requirement - see details
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🐛 Bug Fix

Changes

  • When transforming a Responses API response to the Completions API format, unsupported web_search call items are now skipped (see the sketch below).
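
A minimal sketch of the skip, using hypothetical helper and field names rather than LiteLLM's actual internals:

```python
# Minimal sketch with hypothetical names; the real change lives in LiteLLM's
# Responses -> Chat Completions transformation code.
def output_items_to_text(output_items: list[dict]) -> str:
    """Collect assistant text from Responses API output items, skipping item
    types (e.g. web_search_call) that the Chat Completions format cannot express."""
    parts: list[str] = []
    for item in output_items:
        if item.get("type") != "message":
            # e.g. {"type": "web_search_call", "status": "in_progress"}:
            # no Completions equivalent, so it is skipped instead of raising.
            continue
        for content in item.get("content", []):
            if content.get("type") == "output_text":
                parts.append(content.get("text", ""))
    return "".join(parts)
```

Before the change, an item of an unsupported type hit the "Invalid output_item" branch and raised; now it is dropped.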

Tests

[screenshot of the new test passing locally]

Before change:
[screenshot]

After change:
[screenshot]


Contributor


hey @dawidkulpa, shouldn't we be trying to return it to the user, instead of ignoring it?

Author

@dawidkulpa dawidkulpa Oct 13, 2025


Hey! Good question, I was debating it with myself as well. The issue is that it would be a real hack, since the Completions API just doesn't support returning statuses from these kinds of calls. We could map it to something simple, but that would definitely go against what the API provides.
I would honestly like to do it, but I skipped it because it's a hack and I didn't want to bother you with explanations and potentially harder maintainability.
Just let me know what your opinion is, and I can implement it :)
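
For illustration only, the "map it to something simple" alternative discussed above could look roughly like the sketch below; the field name is hypothetical, off-spec for Chat Completions, and not what this PR implements:

```python
# Hypothetical alternative (NOT implemented by this PR): surface the
# web_search_call status through an extra, non-standard field, since the
# Chat Completions schema has no native place for it.
def web_search_item_to_extra_fields(item: dict) -> dict:
    if item.get("type") != "web_search_call":
        return {}
    # Clients would have to know to look for this off-spec key.
    return {"web_search_call_status": item.get("status")}
```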
