would it work with ollama? #1111
Replies: 2 comments
-
Yes, it works with Ollama.
-
I just cloned GPT-Pilot and already had Ollama set up. All you have to change is the `base_url` and `api_key`, and give it a model name. My config:

```jsonc
"llm": {
    "openai": {
        // Base URL of the provider/server, omitting the trailing "chat/completions" part.
        "base_url": "http://localhost:11434/v1",
        "api_key": "Ollama",
        "connect_timeout": 60.0,
        "read_timeout": 20.0
    },
    ...
    ...
    "agent": {
        "default": {
            "provider": "openai",
            "model": "llama3.3",
            "temperature": 0.5
        }
    },
```

Make sure you have the `v1` at the end of the `base_url`. I have a complex app running now; I should have done something simpler, so it might take a while.
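To sanity-check the config above independently of GPT-Pilot, here is a minimal sketch (stdlib only) that builds an OpenAI-style `chat/completions` request against Ollama's OpenAI-compatible endpoint. It assumes Ollama is listening on the default `http://localhost:11434`; Ollama ignores the API key, but OpenAI-style clients still send one, so any non-empty value like `"Ollama"` works.

```python
import json
import urllib.request

# Same base_url as in the config above; note the trailing /v1.
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat/completions request for Ollama.

    Ollama does not validate the Authorization header, but it must be
    present for OpenAI-compatible clients, so we send a dummy token.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.5,
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer Ollama",  # dummy key, ignored by Ollama
        },
    )


req = build_chat_request("llama3.3", "Say hello in one word.")
print(req.full_url)  # the endpoint GPT-Pilot will call
# With Ollama running, send it with:
#   urllib.request.urlopen(req)
```

If `urlopen(req)` returns a JSON body with a `choices` array, the endpoint is wired up correctly and GPT-Pilot's `openai` provider should work against it.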
-
I have Ollama up and running. Would this work with that?