support for openai responses api #5783
Conversation
they said they are sunsetting completions in august 2026: https://platform.openai.com/docs/deprecations/

#5694 lays the groundwork for a model database, which could have a field indicating which endpoint to use.

@cbruyndoncx interesting - I guess they are getting pushback. openrouter offers it via completions, as does everything else. Odd.

I can't work out how to do streaming, and don't want to maintain this; looking for a volunteer to pick it up @cbruyndoncx @katzdave
Pull Request Overview
This PR adds initial support for OpenAI's Responses API, which is used by the new gpt-5-codex and gpt-5.1-codex models. The implementation introduces a parallel code path that detects these specific models and routes them to the /v1/responses endpoint instead of the standard /v1/chat/completions endpoint.
Key changes:
- Added new Rust types to deserialize Responses API format (different structure from standard chat completions)
- Implemented conversion logic that transforms structured conversation messages into a flat text input format
- Streaming requests for Responses API models fall back to non-streaming calls wrapped in a single-item stream
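The model detection and routing described above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: `uses_responses_api` and `endpoint_for` are hypothetical names, and the real implementation lives in `crates/goose/src/providers/openai.rs`.

```rust
// Hypothetical sketch of the routing logic described in the overview:
// gpt-5-codex / gpt-5.1-codex go to /v1/responses, everything else to
// /v1/chat/completions. Function names are illustrative.
fn uses_responses_api(model_name: &str) -> bool {
    model_name.starts_with("gpt-5-codex") || model_name.starts_with("gpt-5.1-codex")
}

fn endpoint_for(model_name: &str) -> &'static str {
    if uses_responses_api(model_name) {
        "/v1/responses"
    } else {
        "/v1/chat/completions"
    }
}

fn main() {
    assert_eq!(endpoint_for("gpt-5-codex"), "/v1/responses");
    assert_eq!(endpoint_for("gpt-5.1-codex"), "/v1/responses");
    assert_eq!(endpoint_for("gpt-4o"), "/v1/chat/completions");
    println!("routing ok");
}
```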
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| crates/goose/src/providers/openai.rs | Adds model detection, routing logic to use Responses API for gpt-5-codex models, and fallback streaming implementation |
| crates/goose/src/providers/formats/openai.rs | Defines Responses API types and implements conversion functions between standard message format and Responses API format |
No worries, I can pick this up. Will give it a try tomorrow.
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Pull Request Overview
Copilot reviewed 2 out of 2 changed files in this pull request and generated 4 comments.
```rust
                .unwrap()
                .insert("tools".to_string(), json!(tools_spec));
        }

        if let Some(temp) = model_config.temperature {
            payload
                .as_object_mut()
                .unwrap()
                .insert("temperature".to_string(), json!(temp));
        }

        if let Some(tokens) = model_config.max_tokens {
            payload
                .as_object_mut()
                .unwrap()
```
Copilot AI · Nov 20, 2025
The `.unwrap()` calls could panic if the payload is not a valid JSON object. Use proper error handling with `ok_or_else()` or similar to return an `anyhow::Result` error.
Suggested change:

```rust
                .ok_or_else(|| anyhow!("Payload is not a JSON object"))?
                .insert("tools".to_string(), json!(tools_spec));
        }

        if let Some(temp) = model_config.temperature {
            payload
                .as_object_mut()
                .ok_or_else(|| anyhow!("Payload is not a JSON object"))?
                .insert("temperature".to_string(), json!(temp));
        }

        if let Some(tokens) = model_config.max_tokens {
            payload
                .as_object_mut()
                .ok_or_else(|| anyhow!("Payload is not a JSON object"))?
```
```rust
        let parsed_args = if arguments.is_empty() {
            json!({})
        } else {
            serde_json::from_str(arguments).unwrap_or_else(|_| json!({}))
```
Copilot AI · Nov 20, 2025
The `.unwrap_or_else()` silently converts invalid JSON to an empty object. Consider logging this error or using a more explicit error message.
#5837 - have a streaming version generally working now. Working on cleaning it up.

can I close this one @katzdave in favor of yours?

Yep. Pushing on that some more today.
well a start to it...
addresses: #5270
kind of hate that we have to do this.