
Adding Ollama Support #2073

Merged 1 commit on Oct 13, 2024
Conversation

bocklucas
Contributor

Adding Ollama Support

Changes

  • Added support for Ollama in settings
    • Added a Base URL field
    • Added a Model setting that defaults to the current model, so existing OpenAI users can now configure which model they use. The input is a text field rather than a dropdown, unfortunately; I'd recommend adding dropdown support in another PR
    • Added a Max Tokens field so both OpenAI and Ollama users can configure the token limit (a rough sketch of the resulting settings shape follows)
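
For reviewers, here's a minimal sketch of what the settings shape looks like conceptually. The field names and types are illustrative, not the exact ones in this PR:

```ts
// Illustrative sketch only; actual field names and validation in the PR may differ.
type AiSettings = {
  apiKey: string;    // existing field, used by OpenAI (Ollama accepts any value)
  baseURL: string;   // new: e.g. "https://api.openai.com/v1" or "http://localhost:11434/v1"
  model: string;     // new: free-text model name, e.g. "gpt-3.5-turbo" or "llama3.1:8b"
  maxTokens: number; // new: token limit, configurable for both OpenAI and Ollama
};
```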

QA

For development, all I did was run `docker-compose up --build -d`, then I tested and verified my changes against http://localhost:3000

It should be noted that, since I don't have an OpenAI account, I wasn't able to confirm that nothing broke on the OpenAI side. Anyone QAing this with an OpenAI account, feel free to suggest changes and I can make them.

@bocklucas bocklucas mentioned this pull request Oct 12, 2024
@AmruthPillai AmruthPillai merged commit e29f973 into AmruthPillai:main Oct 13, 2024
@wakawakaaa

@bocklucas Please explain how to configure Ollama when using docker-compose for Reactive-Resume. Thanks!

@bocklucas
Contributor Author

bocklucas commented Nov 26, 2024

@wakawakaaa could you elaborate more? Are you able to post your docker-compose file?

My setup is that I run Ollama on Windows and Reactive Resume with docker-compose.

So my config looks like this:

API Key: sk-1234567890abcdef
Base URL: http://192.168.x.x:11434/v1
Model: llama3.1:8b
Max Tokens: 4096

I should note as well that 192.168.x.x is the IP of my Windows machine; just insert the IP of your Ollama instance, or the URL it can be reached at.
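
To show how those four values fit together (a sketch, not code from this PR): Ollama exposes an OpenAI-compatible API under `/v1`, so a standard OpenAI client pointed at that Base URL works as-is. This uses the official `openai` npm package; the values mirror the config above.

```ts
import OpenAI from "openai";

// Sketch only: point the OpenAI client at an Ollama instance.
const client = new OpenAI({
  apiKey: "sk-1234567890abcdef", // Ollama doesn't check the key, but the client requires one
  baseURL: "http://192.168.x.x:11434/v1", // wherever your Ollama instance is reachable
});

const completion = await client.chat.completions.create({
  model: "llama3.1:8b",
  max_tokens: 4096,
  messages: [{ role: "user", content: "Say hello." }],
});

console.log(completion.choices[0].message.content);
```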

[Screenshot from 2024-11-25 18-46-28]

NOTES

  • Based on the regex validation in settings, you MUST end your Base URL with `/v1`, otherwise the `Saved` button won't be enabled (see the sketch after these notes)
  • I noticed recently that my Ollama was throwing CORS errors, so I had to set the `OLLAMA_ORIGINS: *` environment variable on my Windows machine so it would accept requests
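
As an illustration of that first note, a check along these lines would explain the disabled button (the actual pattern in the settings code may differ):

```ts
// Illustrative only: the Base URL must end in /v1 to pass validation.
const baseUrlPattern = /\/v1$/;

console.log(baseUrlPattern.test("http://192.168.x.x:11434/v1")); // true  -> Saved enabled
console.log(baseUrlPattern.test("http://192.168.x.x:11434"));    // false -> Saved stays disabled
```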

Hope this helps!

@thinkjk

thinkjk commented Jan 5, 2025

> @wakawakaaa could you elaborate more? Are you able to post your docker-compose file?
>
> […]
>
> Hope this helps!

I think @wakawakaaa was asking how to get the Ollama Integration option. Here is what I see:
[image]

@bocklucas
Contributor Author

> I think @wakawakaaa was asking how to get the Ollama Integration option. Here is what I see:

Ahhh, gotcha. I think the problem is that the latest image hasn't been pushed yet; you could build it locally, and then you should be able to see the setting.
