Ollama: Forbidden (403) while fetching models on localhost #276
Hi @sudhamjayanthi, thanks for reporting - good for us to document this expected behavior. The issue is the following: on the hosted web version, big-AGI fetches the model list server-side, so the request to http://localhost:11434/api/tags is made by the big-AGI server, where localhost points at the server itself, not at the machine running Ollama.
A quick schematic: browser → big-AGI server → Ollama API, so the Ollama endpoint must be reachable from the big-AGI server, not just from your browser.

A few ways to fix this issue: run big-AGI locally alongside Ollama, expose Ollama on an address the big-AGI server can reach, or tunnel the local Ollama port to a public URL.
Let me know if this helps to fix this networking configuration issue. Edit: I've updated the Ollama Deployment docs.
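For context, a minimal sketch of the last two options (the origin URL is the one used in this thread; the tunnel tool is just one illustrative choice):

```bash
# Option A: make Ollama listen on all interfaces so a remote big-AGI
# server can reach it, and allow the hosted web client's origin:
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=https://get.big-agi.com/ ollama serve

# Option B: tunnel the local Ollama port to a public URL (ngrok shown
# as one example), then use that URL in Models > Add > Ollama:
ngrok http 11434
```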
Closing as an extensive doc is now provided in the app docs.
I had the network error previously on #276 and tried the solutions there. I'm now running big-AGI in a Docker container and Ollama locally. How do I connect them, though? I'm now getting a new error: [Issue] Ollama: (network) fetch failed - Error: connect ECONNREFUSED 127.0.0.1:11434
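For context: inside a Docker container, 127.0.0.1 refers to the container itself, not the host where Ollama is listening, which is why the connection is refused. A minimal sketch of one way to bridge them, assuming the official big-AGI image and its default port (both assumptions, not confirmed in this thread):

```bash
# Map the hostname host.docker.internal to the Docker host; the mapping
# is built into Docker Desktop, and --add-host provides it on Linux:
docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/enricoros/big-agi   # assumed image name

# Make Ollama accept connections from outside the host's loopback:
OLLAMA_HOST=0.0.0.0 ollama serve
```

Then, in Models > Add > Ollama, point the host at http://host.docker.internal:11434 instead of http://localhost:11434.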
Describe the bug

Fails to pull models from the Ollama server on the web version when clicking the fetch models button:

[Issue] Ollama: Forbidden (403) - is http://localhost:11434/api/tags accessible by the server?
Where is it happening?

On the web version (https://get.big-agi.com/).
To Reproduce

Steps to reproduce the behavior:

1. Run the Ollama server locally with `OLLAMA_ORIGINS=https://get.big-agi.com/ ollama serve` (a quick endpoint check follows the steps below).
2. Go to the web client.
3. Go to Models > Add > Ollama and click the fetch models button.
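As a sanity check (a suggestion, not part of the original report), you can verify the endpoint responds on the machine running Ollama:

```bash
# Should print a JSON object listing the locally installed models:
curl http://localhost:11434/api/tags
```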
Expected behavior
It pulls the models from the server without any issues. (The screenshot shows the JSON returned when I hit the same API endpoint in my browser.)
Screenshots / context
Attached in relevant places above.