Failed to connect to the server... Seems like you are using the custom OpenAI provider... #471
Comments
Thanks for the response. Linux: Use http://<private_ip_of_host>:11434. I know the port isn't blocked, since I'm using open-webui in Docker and it connects to Ollama on http://127.0.0.1:11434.
Please remove the trailing slash.
Removing the slash didn't work. Here is the result of the curl request:

```
$ curl -i http://192.168.111.3:11434
Ollama is running
```
Did you press the blue save button after removing the slash?
Yes, I did save it. |
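The suggestion above is to drop the trailing slash from the configured base URL, since a stored value like `http://host:11434/` can produce a double slash when an API route is appended. A minimal sketch of that kind of normalization (the helper name is hypothetical and not Perplexica's actual code):

```python
def normalize_base_url(url: str) -> str:
    """Strip trailing slashes so appending a route like /api/chat
    yields http://host:11434/api/chat, not http://host:11434//api/chat.
    (Hypothetical helper; Perplexica's real config handling may differ.)"""
    return url.rstrip("/")


print(normalize_base_url("http://192.168.111.3:11434/"))
# http://192.168.111.3:11434
```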
Describe the bug
After installing, when I go to the URL http://localhost:3000/ I get two errors:
"Failed to connect to the server. Please try again later."
and
"Seems like you are using the custom OpenAI provider, please open the settings and configure the API key and base URL"
To Reproduce
Steps to reproduce the behavior: go to the URL http://localhost:3000/.
Expected behavior
I expected the Perplexica screen to load.
Screenshots
Additional context
Ollama is not installed in Docker, but it is running and reachable at http://127.0.0.1:11434/.
SearXNG is running in Docker and reachable at http://127.0.0.1:32768/.
I am running Pop!_OS (Ubuntu-based).
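To narrow down which side is failing, a quick reachability probe of both endpoints from wherever Perplexica actually runs can help. This is a generic diagnostic sketch, not part of Perplexica; the URLs are the ones from this report:

```python
import urllib.request
import urllib.error


def check_endpoint(url: str, timeout: float = 3.0) -> str:
    """Return the start of the response body, or an error description.
    A healthy Ollama root endpoint replies with "Ollama is running"."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read(64).decode("utf-8", "replace").strip()
    except (urllib.error.URLError, OSError) as exc:
        return f"unreachable: {exc}"


if __name__ == "__main__":
    # URLs taken from this report; run this from the host or the container.
    for url in ("http://127.0.0.1:11434/", "http://127.0.0.1:32768/"):
        print(url, "->", check_endpoint(url))
```

Note that if both probes succeed from the host but Perplexica runs inside Docker, 127.0.0.1 inside the container refers to the container itself, so the host's private IP (e.g. 192.168.111.3) would need to be configured instead.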