404 This model is only supported in v1/responses and not in v1/chat/completions. #8741
Unanswered · ewitte12-ui asked this question in Help
I've not been able to get past this message. I've tried with and without `useLegacyCompletionsEndpoint: true`.

config.yaml:

```yaml
name: Local Config
version: 1.0.0
schema: v1
models:
  - name: GPT-5.1 Codex
    provider: openai
    model: gpt-5.1-codex # exact model ID from your provider
    apiKey: REMOVED # same key you use in Open WebUI
    apiBase: https://api.openai.com/v1 # same API URL as in Open WebUI Connections, NO trailing slash
    roles:
  - name: GPT-5.1 Codex Mini
    provider: openai
    model: gpt-5.1-codex-mini # exact model ID from your provider
    apiKey: REMOVED
    apiBase: https://api.openai.com/v1
    roles:
```

Replies: 1 comment

- This ended up being because the newer models require the Responses API. I switched to the pre-release and was able to get them to work.
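The 404 in the title means the request is being routed to `/v1/chat/completions`, while newer models such as `gpt-5.1-codex` are only served by `/v1/responses`. A minimal hypothetical sketch of that routing decision (the prefix list, function name, and flag are illustrative assumptions, not Continue's actual code):

```python
# Models served only by the Responses API -- illustrative assumption,
# not an official or complete list.
RESPONSES_ONLY_PREFIXES = ("gpt-5.1-codex",)

def endpoint_for(model: str, use_legacy_completions: bool = False) -> str:
    """Pick the API path a chat request for `model` should hit."""
    if model.startswith(RESPONSES_ONLY_PREFIXES) and not use_legacy_completions:
        return "/v1/responses"
    # Forcing the legacy endpoint (e.g. useLegacyCompletionsEndpoint: true)
    # still sends responses-only models to a route the server rejects,
    # which is why toggling that flag did not help here.
    return "/v1/chat/completions"

print(endpoint_for("gpt-5.1-codex"))  # /v1/responses
print(endpoint_for("gpt-4o"))         # /v1/chat/completions
```

This is consistent with the workaround in the reply: switching to a pre-release build that sends these models to the Responses API resolves the error.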