Closed as not planned
Description
Feature request
The Ollama integration assumes that all models are served at "localhost:11434". If the Ollama service is hosted on a different machine, the integration fails.
Can we add an environment variable that, if present, overrides this URL, so the correct address of the Ollama server can be set?
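A minimal sketch of the kind of override being requested. The environment variable name `OLLAMA_BASE_URL` and the helper function are assumptions for illustration only, not an existing API of the integration:

```python
import os

# Default that the integration currently hard-codes.
DEFAULT_OLLAMA_URL = "http://localhost:11434"

def resolve_ollama_base_url() -> str:
    """Return the Ollama server URL, preferring a hypothetical env override if set."""
    return os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_URL)

# Example: point the integration at a remote Ollama host.
#   export OLLAMA_BASE_URL=http://gpu-box.local:11434
base_url = resolve_ollama_base_url()
print(f"Using Ollama server at {base_url}")
```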
Motivation
In my setup, Ollama sits on a separate machine that is resourced for serving LLMs.
Your contribution
I'm afraid I don't have any knowledge of Python; I only know Go, C++, and Rust.