
Conversation

shawn111 (Contributor) commented Aug 26, 2025

OpenAI-compatible providers might not support gpt-4o-mini.

This can fix #4350.

shawn111 (Contributor, Author) commented Aug 26, 2025

@katzdave This commit lets me bypass your OPEN_AI_DEFAULT_FAST_MODEL setting, but I'm not sure it's the best way.

Maybe, as you said, creating a new GOOSE_FAST_MODEL variable would handle this more flexibly.
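
Here is a minimal Rust sketch of that suggestion, assuming a hypothetical resolution order (an explicit GOOSE_FAST_MODEL, then a provider default, then the regular model); apart from the proposed variable name, none of these identifiers come from the goose codebase.

```rust
use std::env;

/// Hypothetical resolution order for the proposed GOOSE_FAST_MODEL variable:
/// an explicit GOOSE_FAST_MODEL wins, otherwise use the provider's default
/// fast model (if it has one), otherwise fall back to the regular model.
fn resolve_fast_model(provider_default: Option<&str>, regular_model: &str) -> String {
    env::var("GOOSE_FAST_MODEL")
        .ok()
        .filter(|v| !v.trim().is_empty())
        .or_else(|| provider_default.map(str::to_string))
        .unwrap_or_else(|| regular_model.to_string())
}

fn main() {
    // With no GOOSE_FAST_MODEL set and no provider default, the regular model is reused.
    println!("{}", resolve_fast_model(None, "my-compatible-model"));
}
```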

DOsinga (Collaborator) commented Aug 26, 2025

This will certainly fix it, but it's not really the way I'd do it.

shawn111 force-pushed the only-openai-run-fast branch from 5e266c1 to 23c890c on August 27, 2025 02:06

Commit: "for OpenAI compatible might not support gpt-4o-mini"
Signed-off-by: Shawn Wang <shawn111@gmail.com>

shawn111 force-pushed the only-openai-run-fast branch from 23c890c to 590898e on August 27, 2025 02:10
shawn111 (Contributor, Author) commented

crates/goose/src/providers/databricks.rs does a similar check for the fast model.
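
For illustration only, this is the general shape of such a provider-side guard; the struct, field, and method names are hypothetical and this is not the actual databricks.rs code.

```rust
/// Illustrative only: gate the default fast model behind a provider check,
/// in the spirit of the check described above. All names here are hypothetical.
const DEFAULT_FAST_MODEL: &str = "gpt-4o-mini";

struct OpenAiCompatible {
    host: String,
    model: String,
}

impl OpenAiCompatible {
    /// Assume only the official OpenAI endpoint serves gpt-4o-mini;
    /// OpenAI-compatible gateways may not expose it.
    fn fast_model(&self) -> &str {
        if self.host.contains("api.openai.com") {
            DEFAULT_FAST_MODEL
        } else {
            &self.model
        }
    }
}

fn main() {
    let p = OpenAiCompatible {
        host: "https://my-gateway.example.com/v1".to_string(),
        model: "my-compatible-model".to_string(),
    };
    println!("fast model: {}", p.fast_model());
}
```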

shawn111 changed the title from "fix: workaround only openai support fast model" to "fix: openai add checking about fast model" on Aug 27, 2025
shawn111 (Contributor, Author) commented

This is fixed by "Fast model falls back to regular" (#4375).
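
As a hedged sketch of the fall-back idea behind #4375 (not the merged implementation): try the fast model first and retry with the regular model on failure. Here complete_with is a made-up stand-in for a provider request.

```rust
/// Stand-in for a provider request; pretend this endpoint rejects gpt-4o-mini.
fn complete_with(model: &str, prompt: &str) -> Result<String, String> {
    if model == "gpt-4o-mini" {
        Err(format!("model {model} is not available on this endpoint"))
    } else {
        Ok(format!("[{model}] {prompt}"))
    }
}

/// Try the fast model first; if that fails, fall back to the regular model.
fn complete_fast(fast: &str, regular: &str, prompt: &str) -> Result<String, String> {
    complete_with(fast, prompt).or_else(|_| complete_with(regular, prompt))
}

fn main() {
    let out = complete_fast("gpt-4o-mini", "my-compatible-model", "summarize this session");
    println!("{out:?}");
}
```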

shawn111 closed this on Aug 28, 2025
shawn111 deleted the only-openai-run-fast branch on August 29, 2025 00:51


Development

Successfully merging this pull request may close these issues:

[Bug] Goose attempts to use the "gpt-4o-mini" model when using OpenAI (GPT-4 and other OpenAI models, including OpenAI-compatible ones)
