[BUG] - OPENAI_API only working with ollama #4473

Closed · hnico21 opened this issue Oct 30, 2024 · 4 comments
Labels: bug (Something isn't working), triage

Comments


hnico21 commented Oct 30, 2024

First Check

  • This is not a feature request.
  • I added a very descriptive title to this issue (title field is above this).
  • I used the GitHub search to find a similar issue and didn't find it.
  • I searched the Mealie documentation, with the integrated search.
  • I already read the docs and didn't find an answer.
  • This issue can be replicated on the demo site (https://demo.mealie.io/).

What is the issue you are experiencing?

I want to switch AI providers. I'm currently running Ollama successfully, but I can't get GPT-4 to work. I also tried the Ollama and OpenAI APIs provided by my Open-Webui instance (a web UI for Ollama and other AIs).
#4443 has a similar problem.

Steps to Reproduce

      OPENAI_BASE_URL: "http://192.168.0.5:11434/v1"
      OPENAI_API_KEY: "ollama"
#      OPENAI_BASE_URL: http://open-webui:8080/openai
#      OPENAI_API_KEY: "sk-Private"
#      OPENAI_BASE_URL: "https://api.openai.com/v1"
#      OPENAI_API_KEY: "sk-Private"
#      OPENAI_MODEL: "gpt-4"
      OPENAI_MODEL: "llama3.2:latest"
      OPENAI_WORKERS: 1
      OPENAI_ENABLE_IMAGE_SERVICES: false

Please provide relevant logs

Relevant Open-Webui logs:

When I make an OpenAI API call to Open-Webui that forwards it to OpenAI:

ERROR [open_webui.apps.openai.main] 400, message='Bad Request', url='https://api.openai.com/v1/chat/completions'
Traceback (most recent call last):
  File "/app/backend/open_webui/apps/openai/main.py", line 475, in generate_chat_completion
    r.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/aiohttp/client_reqrep.py", line 1121, in raise_for_status
    raise ClientResponseError(
aiohttp.client_exceptions.ClientResponseError: 400, message='Bad Request', url='https://api.openai.com/v1/chat/completions'
INFO:     172.27.0.14:53236 - "POST /openai/chat/completions HTTP/1.1" 400 Bad Request

When I make an OpenAI API call to Open-Webui for another AI (Cloudflare AI Worker):
INFO: 172.27.0.14:49148 - "POST /openai/chat/completions HTTP/1.1" 200 OK

When I make an Ollama API call to Open-Webui for Ollama:
INFO: 172.27.0.14:51222 - "POST /ollama/v1/chat/completions HTTP/1.1" 400 Bad Request

Mealie Version

v2.1.0

Deployment

Docker (Linux)

Additional Deployment Details

Running on an Ubuntu server with Mealie and Open-Webui in Docker and Ollama on the host, with NPM as a reverse proxy. Reference: Open-Webui-API-Endpoints.

hnico21 added the bug and triage labels on Oct 30, 2024

gmag11 commented Oct 30, 2024

You may try Mistral models; they have a generous free tier. It may happen that their OpenAI API is not fully compliant. In that case you can use the LiteLLM project as a proxy for many local (Ollama) or cloud models, including Mistral.
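For reference, a rough sketch of what such a LiteLLM proxy setup could look like. This is only an assumption based on LiteLLM's model_list config format and its default proxy port (4000); the model names, key reference, and addresses below are placeholders, not values taken from this thread.

    # litellm_config.yaml (sketch, not from this issue)
    model_list:
      - model_name: mistral-large              # name Mealie would request via OPENAI_MODEL
        litellm_params:
          model: mistral/mistral-large-latest
          api_key: os.environ/MISTRAL_API_KEY  # read from an environment variable
      - model_name: llama3.2                   # local Ollama model exposed through the proxy
        litellm_params:
          model: ollama/llama3.2
          api_base: http://192.168.0.5:11434

Mealie would then point at the proxy instead of Ollama directly, e.g. OPENAI_BASE_URL: "http://<litellm-host>:4000/v1" and OPENAI_MODEL: "mistral-large".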

michael-genson (Collaborator) commented:

I'm not familiar with Open Web UI, but for the OpenAI integration you shouldn't provide the base URL to Mealie; you only need to provide the OpenAI API key.
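In compose terms that boils down to something like the following minimal sketch. The key value is a placeholder, and the model line is optional; gpt-4o is shown only because it is the model that ends up working later in this thread.

      OPENAI_API_KEY: "sk-Private"   # hosted OpenAI; no OPENAI_BASE_URL needed
      OPENAI_MODEL: "gpt-4o"         # optional override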

hnico21 (Author) commented Oct 30, 2024

Thanks for your replies. Apparently I overcomplicated the issue; just providing the API key did the trick.
Now it looks like the issue only occurs when I try to specify "gpt-4" as the model. "gpt-4o" works.

      OPENAI_API_KEY: "sk-Private"
      OPENAI_MODEL: "gpt-4"

Apparently there is a problem with Cloudflare's API:

{
  "messages": [
    {
      "role": "system",
      "content": "You are a bot that parses user input into recipe ingredients. You will ......."
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "[\"2 tbsp minced cilantro, leaves and stems\"]"
        }
      ]
    }
  ],
  "return_citations": true,
  "return_images": true
}

Response (HTTP 400):

{
  "errors": [
    {
      "message": "AiError: Bad input: must have required property 'prompt', 'messages1content' must be string, must match exactly one schema in oneOf",
      "code": 5006
    }
  ],
  "success": false,
  "result": {},
  "messages": []
}

I will contact cloudflare for this.
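For context: the error above says the user message's content must be a string (or that a top-level "prompt" property is required instead, per the oneOf schema). A hedged guess, not confirmed by Cloudflare, is that the same request with the content array flattened to a plain string (and the extra return_* flags omitted) would match their schema:

{
  "messages": [
    {
      "role": "system",
      "content": "You are a bot that parses user input into recipe ingredients. You will ......."
    },
    {
      "role": "user",
      "content": "[\"2 tbsp minced cilantro, leaves and stems\"]"
    }
  ]
}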

hnico21 closed this as completed on Oct 30, 2024
michael-genson (Collaborator) commented:

Glad you got it working!
