
Unable to use Ollama models - Adding chat/completions to URL #436

Open
clickbrain opened this issue Nov 7, 2024 · 1 comment

Comments

clickbrain commented Nov 7, 2024

I have my Ollama endpoint on another machine, with a domain mapped to it. When I try to use it, I get an error, and the logs seem to say the model is not found. The model is installed, but it looks like srcbook is appending chat/completions to the URL. Is there any way to change this?
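For context, here is a sketch (not srcbook's actual code) of how an OpenAI-compatible client typically builds the request URL: it appends `chat/completions` to whatever base URL is configured. Ollama serves its OpenAI-compatible API under the `/v1` prefix, so a base URL pointing at the Ollama root produces a path Ollama does not serve. The hostname below is a placeholder.

```python
def build_chat_url(base_url: str) -> str:
    """Append the OpenAI-style chat-completions path to a base URL,
    the way many OpenAI-compatible clients do internally."""
    return base_url.rstrip("/") + "/chat/completions"

# Base URL at the Ollama root: the resulting path lacks the /v1 prefix,
# so Ollama returns a 404 ("not found"-style errors in the logs).
print(build_chat_url("http://ollama.example.com:11434"))
# -> http://ollama.example.com:11434/chat/completions

# Base URL including Ollama's OpenAI-compatible prefix yields a path
# Ollama actually serves.
print(build_chat_url("http://ollama.example.com:11434/v1"))
# -> http://ollama.example.com:11434/v1/chat/completions
```

If srcbook exposes a configurable base URL for the provider, setting it to the `/v1` prefix rather than the bare host may be all that is needed; whether it does is worth confirming in the srcbook docs.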

@clickbrain clickbrain changed the title Unable to use Ollama models - Affing chat/completions to URL Unable to use Ollama models - Adding chat/completions to URL Nov 7, 2024
@jaimedevelop commented

The same happens to me: Ollama runs, but somehow srcbook cannot fetch the models. I have srcbook running through WSL on Windows.
