Using docker with ollama all I get is a spinning wheel #396

Open
andreascschmidt opened this issue Oct 6, 2024 · 10 comments
Labels
bug Something isn't working

Comments

@andreascschmidt

Describe the bug
Under localhost:3000 I'm only getting a spinning wheel but no search.

To Reproduce
I used the following docker-compose.yml, which includes Ollama, and linked it in config.toml:

services:
  searxng:
    image: docker.io/searxng/searxng:latest
    volumes:
      - ./searxng:/etc/searxng:rw
    ports:
      - 4000:8080
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-backend:
    build:
      context: .
      dockerfile: backend.dockerfile
    image: itzcrazykns1337/perplexica-backend:main
    environment:
      - SEARXNG_API_URL=http://searxng:8080
    depends_on:
      - searxng
    ports:
      - 3001:3001
    volumes:
      - backend-dbstore:/home/perplexica/data
      - ./config.toml:/home/perplexica/config.toml
    extra_hosts:
      - 'host.docker.internal:host-gateway'
    networks:
      - perplexica-network
    restart: unless-stopped

  perplexica-frontend:
    build:
      context: .
      dockerfile: app.dockerfile
      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
    image: itzcrazykns1337/perplexica-frontend:main
    depends_on:
      - perplexica-backend
    ports:
      - 3000:3000
    networks:
      - perplexica-network
    restart: unless-stopped

  ollama:
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434
    container_name: ollama
    image: ollama/ollama

networks:
  perplexica-network:

volumes:
  backend-dbstore:

config.toml
OLLAMA = "http://host.docker.internal:11434"
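
For reference, a sketch of where that line sits, assuming the [API_ENDPOINTS] layout of the repo's sample.config.toml (verify against your own file):

[API_ENDPOINTS]
SEARXNG = "http://searxng:8080" # overridden by SEARXNG_API_URL in the compose file
OLLAMA = "http://host.docker.internal:11434" # resolves inside the backend container only because of its extra_hosts mapping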

[Screenshot from 2024-10-06 15-06-54]

@andreascschmidt added the bug label Oct 6, 2024
@ItzCrazyKns
Owner

What Ollama address are you using?

@andreascschmidt
Author

andreascschmidt commented Oct 6, 2024

In the docker compose:

  ollama:
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434
    container_name: ollama
    image: ollama/ollama

and in the config file: OLLAMA = "http://host.docker.internal:11434/". I also tried localhost.

I've checked with netstat that the port is open.

@BeNeDeLuX

Confirmed! I have the exact same error.

Ollama is running on another host on the network in my case, so I set the IP for that host like this:
OLLAMA = "http://172.16.17.28:11434" # Ollama API URL - http://host.docker.internal:11434
I receive "Ollama is running" if I open the configured URL and port in the browser.

Is there any (debug) log I can provide or generate?
Thanks for any help.
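
If it helps, backend logs can be captured with stock Docker Compose tooling (service names taken from the compose file above); running this while reproducing the spinning wheel would be the place to see any connection errors toward the Ollama URL:

docker compose logs -f perplexica-backend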

@taoi11

taoi11 commented Oct 8, 2024

The cause of this issue for me was:

      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001

I was deploying the stack on a remote server and using Tailscale to make the connections.

Building the front end with the following args solved the issue:

      args:
        - NEXT_PUBLIC_API_URL=http://host-name.ts-subdomain.ts.net:3001/api
        - NEXT_PUBLIC_WS_URL=ws://host-name.ts-subdomain.ts.net:3001

I don't have the logs anymore. Inspecting the traffic in the browser, I noticed the UI was hanging because a call was being made to http://127.0.0.1:3001/api/models; from the browser's point of view, 127.0.0.1 is the client machine, not the server.

Hope this helps.
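
The same logic applies to any remote deployment, not just Tailscale: the NEXT_PUBLIC_* URLs are baked into the frontend at build time and fetched by the browser, so they must be reachable from the client machine. A sketch for a plain LAN setup, with 192.168.1.50 as a placeholder for the server's address:

      args:
        - NEXT_PUBLIC_API_URL=http://192.168.1.50:3001/api
        - NEXT_PUBLIC_WS_URL=ws://192.168.1.50:3001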

@ItzCrazyKns
Owner

in the docker compose

  ollama:
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434
    container_name: ollama
    image: ollama/ollama

and in the config file OLLAMA = "http://host.docker.internal:11434/" I also tried localhost

I've checked with netstat the port is open

Since you're running Ollama alongside the other services, you can just attach it to the Perplexica network and then access it at http://ollama:11434.
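
A sketch of that setup (note that in the compose file posted above, the ollama service never joins perplexica-network, which would explain the hostname not resolving from the backend):

  ollama:
    image: ollama/ollama
    container_name: ollama
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - 11434:11434 # host mapping stays optional for Perplexica itself
    networks:
      - perplexica-network # same network as perplexica-backend

with config.toml then pointing at the service name:

OLLAMA = "http://ollama:11434"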

@bmad-221B

Try pulling llama3, then restart; I think it should work.

@waltermo

I had the same issue using Ollama and SearXNG from two other Docker stacks. Then I decided to build the images locally (perplexica-backend:main and perplexica-frontend:main) instead of pulling from docker.io, and then it worked... maybe there is an issue with the images on Docker Hub...

@ItzCrazyKns
Owner

There is no issue with the images on Docker Hub. They are hardcoded to a localhost reference because the NEXT_PUBLIC vars get bundled into the JavaScript when we build the images, so there is no way to change them afterwards. I've mentioned this in the update guide as well: you need to build your own images if you wish to use an IP other than localhost. There's nothing I can do.
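
For anyone following along: after editing the NEXT_PUBLIC_* args in docker-compose.yml, a local rebuild with standard Docker Compose commands looks like this:

docker compose build perplexica-frontend
docker compose up -d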

@raydoomed

raydoomed commented Nov 9, 2024

I encountered the same issue because I had installed Skybox, which also uses port 3001. Perplexica's model API also uses port 3001, so the two conflict.
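
One way around such a host-port conflict (a sketch; it assumes the backend keeps listening on 3001 inside its container) is to remap only the published port and rebuild the frontend with matching URLs:

  perplexica-backend:
    ports:
      - 3002:3001 # host port 3002 avoids the clash; the container still listens on 3001

  perplexica-frontend:
    build:
      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3002/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3002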

@andreascschmidt
Author

@ItzCrazyKns I haven't been able to get this going, and a few more people are opening new issues about what sounds like the same problem.

#437
#467

Would you mind adding a foolproof step-by-step guide with Ollama included, or maybe even better Open WebUI, assuming you have it running well?
