Using docker with ollama all I get is a spinning wheel #396
Comments
What Ollama address are you using?
In the docker-compose file the service is `ollama:`, and in the config file I have OLLAMA = "http://host.docker.internal:11434/". I've checked with netstat; the port is open.
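A common cause of this symptom on Linux is that `host.docker.internal` does not resolve inside containers by default (unlike on Docker Desktop for Mac/Windows). A minimal sketch of a fix, assuming the backend service is named `perplexica-backend` (the service name is an assumption, not confirmed in the thread):

```yaml
# docker-compose.yml (sketch): on Linux, host.docker.internal is not
# defined by default, so map it explicitly to the host's gateway.
# Requires Docker Engine 20.10 or newer for "host-gateway".
services:
  perplexica-backend:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

With this mapping in place, the `OLLAMA = "http://host.docker.internal:11434"` setting can reach an Ollama instance listening on the host.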
Confirm! I have the exact same error. Ollama is running on another host on the network in my case, so I set the IP for that host like this: Is there any (debug) log that I can provide or generate?
The cause for this issue for me was:
I was deploying the stack on a remote server and using Tailscale to make the connections. Building the frontend with the following args solved the issue:
I don't have the logs anymore. Inspecting the traffic in the browser, I noticed that the UI was hanging because a call was being made to Hope this helps.
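The comment above elides the exact build args, but the general pattern is passing the externally reachable URLs at image build time. A sketch, assuming the arg names are `NEXT_PUBLIC_API_URL` and `NEXT_PUBLIC_WS_URL` (these names are an assumption based on the maintainer's mention of "next public vars", not confirmed here):

```yaml
# docker-compose.yml (sketch): NEXT_PUBLIC_* values are baked into the
# frontend bundle at build time, so they must be set as build args
# rather than runtime environment variables. Replace <your-host> with
# the address clients actually use (e.g. a Tailscale hostname or IP).
services:
  perplexica-frontend:
    build:
      context: .
      args:
        - NEXT_PUBLIC_API_URL=http://<your-host>:3001/api
        - NEXT_PUBLIC_WS_URL=ws://<your-host>:3001
```

After changing these values, the frontend image must be rebuilt for the change to take effect.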
Since you're running Ollama alongside the other images, you can just attach it to the Perplexica network and then access it by its service name.
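When Ollama runs as a service in the same compose file, containers can reach each other by service name over the shared network, so no `host.docker.internal` indirection is needed. A sketch, assuming the service is named `ollama` as shown in the compose snippet earlier in the thread:

```toml
# config.toml (sketch): inside the compose network, the service name
# "ollama" resolves directly to the Ollama container.
OLLAMA = "http://ollama:11434"
```

This avoids the Linux `host.docker.internal` resolution problem entirely, since traffic never leaves the Docker network.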
Try pulling llama3, I think, then restart; it should work.
I had the same issue using Ollama and SearXNG in two other Docker stacks. Then I decided to build the images locally (perplexica-backend:main and perplexica-frontend:main) instead of pulling from docker.io, and it worked. Maybe there is an issue with the image on Docker Hub?
There is no issue with the images on Docker Hub; they are hardcoded to a local reference. When we build the images, the NEXT_PUBLIC vars get bundled into the JavaScript, so there is no way to change them afterwards. I've mentioned this in the update guide as well: you need to build your own images if you wish to use an IP other than localhost. There's nothing I can do.
I encountered the same issue because I had installed Skybox, which also uses port 3001. Perplexica's model-calling backend uses port 3001 as well, which conflicts.
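If another application already occupies port 3001 on the host, one option is to remap the published port in the compose file. A sketch, assuming the service is named `perplexica-backend` (the name and the original `3001:3001` mapping are assumptions):

```yaml
# docker-compose.yml (sketch): publish the backend on host port 3002
# instead of 3001, leaving the container-internal port unchanged.
services:
  perplexica-backend:
    ports:
      - "3002:3001"
```

Note that, per the maintainer's comment above, the frontend bakes the backend URL in at build time, so it would also need to be rebuilt pointing at the new host port.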
@ItzCrazyKns I haven't been able to get this going, and a few more people are opening new issues about what sounds like the same problem. Would you mind writing a foolproof step-by-step guide with Ollama included, or maybe even better, Open WebUI, assuming you have it running well?
Describe the bug
Under localhost:3000 I only get a spinning wheel, but no search results.
To Reproduce
I used the docker-compose.yml including Ollama, which I linked in the config.toml:
config.toml
OLLAMA = "http://host.docker.internal:11434"