Conversation

@michaelneale
Collaborator

@michaelneale michaelneale commented Nov 18, 2025

well a start to it...

addresses: #5270

kind of hate that we have to do this.

@cbruyndoncx
Contributor

They said they're sunsetting completions in August 2026: https://platform.openai.com/docs/deprecations/

@katzdave
Collaborator

#5694 lays the groundwork for a model database, which we could have a field to check which endpoint to use.

@michaelneale michaelneale marked this pull request as ready for review November 19, 2025 22:06
Copilot AI review requested due to automatic review settings November 19, 2025 22:06
@michaelneale
Collaborator Author

@cbruyndoncx interesting - I guess they are getting pushback. OpenRouter offers it via completions, as does everything else. Odd.

@michaelneale
Collaborator Author

I can't work out how to do streaming, and I don't want to maintain this; looking for a volunteer to pick it up @cbruyndoncx @katzdave

Contributor

Copilot AI left a comment


Pull Request Overview

This PR adds initial support for OpenAI's Responses API, which is used by the new gpt-5-codex and gpt-5.1-codex models. The implementation introduces a parallel code path that detects these specific models and routes them to the /v1/responses endpoint instead of the standard /v1/chat/completions endpoint.

Key changes:

  • Added new Rust types to deserialize Responses API format (different structure from standard chat completions)
  • Implemented conversion logic that transforms structured conversation messages into a flat text input format
  • Streaming requests for Responses API models fall back to non-streaming calls wrapped in a single-item stream
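The streaming fallback described in the last bullet can be sketched roughly as follows. This is an illustrative, dependency-free sketch, not the PR's actual code: all names (`complete_non_streaming`, `complete_as_stream`) are hypothetical, and a plain iterator stands in for an async stream.

```rust
// Stand-in for a blocking, non-streaming call to /v1/responses
// (hypothetical name; the real implementation issues an HTTP request).
fn complete_non_streaming(prompt: &str) -> String {
    format!("response to: {prompt}")
}

// Expose the non-streaming call through a streaming-shaped interface by
// yielding the single complete response as a one-item "stream".
fn complete_as_stream(prompt: &str) -> impl Iterator<Item = String> {
    std::iter::once(complete_non_streaming(prompt))
}

fn main() {
    let chunks: Vec<String> = complete_as_stream("hello").collect();
    assert_eq!(chunks.len(), 1); // the whole response arrives as one chunk
    println!("{}", chunks[0]);
}
```

Callers written against the streaming interface keep working; they simply receive one large chunk instead of many small ones.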

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

File Description
crates/goose/src/providers/openai.rs Adds model detection, routing logic to use Responses API for gpt-5-codex models, and fallback streaming implementation
crates/goose/src/providers/formats/openai.rs Defines Responses API types and implements conversion functions between standard message format and Responses API format
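The model detection and routing described above can be sketched like this. Function names and the exact matching rule are illustrative assumptions, not the PR's actual code; the real logic lives in `crates/goose/src/providers/openai.rs`.

```rust
// Hypothetical sketch: certain model names are routed to /v1/responses,
// everything else to the standard /v1/chat/completions endpoint.
fn uses_responses_api(model: &str) -> bool {
    // Prefix match is an assumption here; it would also catch
    // variants such as "gpt-5-codex-mini".
    model.starts_with("gpt-5-codex") || model.starts_with("gpt-5.1-codex")
}

fn endpoint_for(model: &str) -> &'static str {
    if uses_responses_api(model) {
        "/v1/responses"
    } else {
        "/v1/chat/completions"
    }
}

fn main() {
    assert_eq!(endpoint_for("gpt-5-codex"), "/v1/responses");
    assert_eq!(endpoint_for("gpt-4o"), "/v1/chat/completions");
}
```

The model database mentioned in #5694 would replace the hard-coded check with a per-model field saying which endpoint to use.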

@katzdave
Collaborator

I can't work out how to do streaming, and I don't want to maintain this; looking for a volunteer to pick it up @cbruyndoncx @katzdave

No worries, I can pick this up. Will give it a try tomorrow.

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings November 20, 2025 15:53
Contributor

Copilot AI left a comment


Pull Request Overview

Copilot reviewed 2 out of 2 changed files in this pull request and generated 4 comments.

Comment on lines +761 to +775
.unwrap()
.insert("tools".to_string(), json!(tools_spec));
}

if let Some(temp) = model_config.temperature {
payload
.as_object_mut()
.unwrap()
.insert("temperature".to_string(), json!(temp));
}

if let Some(tokens) = model_config.max_tokens {
payload
.as_object_mut()
.unwrap()

Copilot AI Nov 20, 2025


The .unwrap() calls could panic if the payload is not a valid JSON object. Use proper error handling with ok_or_else() or similar to return an anyhow::Result error.

Suggested change

Before:

.unwrap()
.insert("tools".to_string(), json!(tools_spec));
}
if let Some(temp) = model_config.temperature {
payload
.as_object_mut()
.unwrap()
.insert("temperature".to_string(), json!(temp));
}
if let Some(tokens) = model_config.max_tokens {
payload
.as_object_mut()
.unwrap()

After:

.ok_or_else(|| anyhow!("Payload is not a JSON object"))?
.insert("tools".to_string(), json!(tools_spec));
}
if let Some(temp) = model_config.temperature {
payload
.as_object_mut()
.ok_or_else(|| anyhow!("Payload is not a JSON object"))?
.insert("temperature".to_string(), json!(temp));
}
if let Some(tokens) = model_config.max_tokens {
payload
.as_object_mut()
.ok_or_else(|| anyhow!("Payload is not a JSON object"))?
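The pattern Copilot suggests can be shown in miniature. To keep the sketch dependency-free, a `HashMap` wrapped in `Option` stands in for `serde_json`'s `Value`/`Map` and a `String` error for `anyhow::Error`; the point is that `ok_or_else` turns the `None` case into an error that propagates with `?` instead of panicking.

```rust
use std::collections::HashMap;

// `None` models a payload that is not a JSON object, the case where
// the original code's `.unwrap()` would panic.
type Payload = Option<HashMap<String, String>>;

fn insert_field(payload: &mut Payload, key: &str, value: &str) -> Result<(), String> {
    payload
        .as_mut() // analogous to payload.as_object_mut(), which returns an Option
        .ok_or_else(|| "payload is not a JSON object".to_string())?
        .insert(key.to_string(), value.to_string());
    Ok(())
}

fn main() {
    let mut ok_payload: Payload = Some(HashMap::new());
    assert!(insert_field(&mut ok_payload, "temperature", "0.7").is_ok());

    // With ok_or_else the bad case becomes an Err, not a panic.
    let mut bad_payload: Payload = None;
    assert!(insert_field(&mut bad_payload, "temperature", "0.7").is_err());
}
```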
Comment on lines +819 to +822
let parsed_args = if arguments.is_empty() {
json!({})
} else {
serde_json::from_str(arguments).unwrap_or_else(|_| json!({}))

Copilot AI Nov 20, 2025


The .unwrap_or_else() silently converts invalid JSON to an empty object. Consider logging this error or using a more explicit error message.

Copilot generated this review using guidance from repository custom instructions.
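The reviewer's point is log-then-default rather than silent fallback. A dependency-free sketch, with `i64` parsing standing in for `serde_json::from_str` and `eprintln!` standing in for a `tracing::warn!` call:

```rust
// Hypothetical sketch: on a parse failure, record what went wrong before
// substituting the default, instead of discarding the error silently.
fn parse_args_or_default(arguments: &str) -> i64 {
    if arguments.is_empty() {
        return 0; // analogous to returning json!({}) for an empty string
    }
    arguments.parse::<i64>().unwrap_or_else(|err| {
        // In the real code this would be tracing::warn! with the tool name.
        eprintln!("failed to parse tool-call arguments {arguments:?}: {err}");
        0
    })
}

fn main() {
    assert_eq!(parse_args_or_default(""), 0);
    assert_eq!(parse_args_or_default("42"), 42);
    assert_eq!(parse_args_or_default("not a number"), 0); // logged, then defaulted
}
```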
@michaelneale michaelneale marked this pull request as draft November 21, 2025 07:25
@katzdave
Collaborator

#5837 has a streaming version generally working now. Working on cleaning it up.

@michaelneale
Collaborator Author

can I close this one @katzdave in favor of yours?

@katzdave
Collaborator

katzdave commented Dec 1, 2025

can I close this one @katzdave in favor of yours?

Yep. Pushing on that some more today.

@katzdave katzdave closed this Dec 1, 2025