[BUG] - OPENAI_API only working with ollama #4473
Comments
You may try Mistral models. They have a generous free tier. It may happen that their OpenAI API is not fully compliant; in that case you can use the litellm project as a proxy for many local (Ollama) or cloud models, including Mistral.
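For reference, a minimal sketch of pointing an OpenAI-compatible client at a litellm proxy, the same pattern Mealie would use; the address, key, and model alias below are placeholders, not values from this thread:

```python
# Quick check of an OpenAI-compatible proxy (e.g. litellm) before pointing Mealie at it.
# base_url, api_key, and the model alias are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # hypothetical litellm proxy address
    api_key="sk-anything",             # litellm accepts any key unless one is configured
)

resp = client.chat.completions.create(
    model="mistral-small",  # hypothetical model alias defined in the proxy config
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(resp.choices[0].message.content)
```

If this call succeeds, the proxy URL and model alias should be usable as the base URL and model in any OpenAI-compatible integration.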
I'm not familiar with Open WebUI, but for the OpenAI integration you shouldn't provide the base URL to Mealie; you only need to provide the OpenAI API key.
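A minimal sketch of a key-only sanity check outside Mealie, assuming the standard openai Python SDK: with no base URL set, the client talks to the default https://api.openai.com/v1 endpoint, which mirrors leaving the base-URL setting empty. The model name is only an example:

```python
# Verify the OpenAI API key works on the default endpoint (no base_url supplied).
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # key only, default endpoint

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # any model the key has access to
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```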
Thanks for your replies. Apparently I overcomplicated the issue. Just providing the API key did the trick.
Apparently there is a problem with Cloudflare's API. The response is HTTP 400:
I will contact Cloudflare about this.
Glad you got it working!
First Check
What is the issue you are experiencing?
I want to switch the AI provider. I'm currently running Ollama successfully, but I can't get GPT-4 to work. I also tried the Ollama and OpenAI APIs provided by my Open WebUI instance (a web UI for Ollama and other AIs).
#4443 has a similar problem.
Steps to Reproduce
Please provide relevant logs
Relevant Open WebUI logs:
When I make an OpenAI-API call to Open WebUI to forward to OpenAI:
When I make an OpenAI-API call to Open WebUI for another AI (Cloudflare AI Worker):
INFO: 172.27.0.14:49148 - "POST /openai/chat/completions HTTP/1.1" 200 OK
When I make an Ollama-API call to Open WebUI for Ollama:
INFO: 172.27.0.14:51222 - "POST /ollama/v1/chat/completions HTTP/1.1" 400 Bad Request
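For comparison, a minimal sketch for replaying those two calls outside Mealie, assuming the requests library; host, port, API key, and model name are placeholders, not values from this deployment:

```python
# Replay the two chat-completions calls against Open WebUI to compare 200 vs. 400.
import requests

BASE = "http://localhost:3000"  # hypothetical Open WebUI address
HEADERS = {"Authorization": "Bearer sk-your-open-webui-key"}  # placeholder key
PAYLOAD = {
    "model": "llama3",  # hypothetical model name
    "messages": [{"role": "user", "content": "ping"}],
}

for path in ("/openai/chat/completions", "/ollama/v1/chat/completions"):
    r = requests.post(f"{BASE}{path}", headers=HEADERS, json=PAYLOAD, timeout=60)
    print(path, r.status_code, r.text[:200])
```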
Mealie Version
v2.1.0
Deployment
Docker (Linux)
Additional Deployment Details
Running on an Ubuntu server with Mealie and Open WebUI in Docker and Ollama on the host, plus NPM (Nginx Proxy Manager) as a reverse proxy. Open WebUI API endpoints