Custom Ollama Self Hosted LLM #88
Unanswered · Python1012020 asked this question in Q&A · 0 replies
Hi! I'd like to know how I can integrate a self-hosted Ollama LLM in place of the default LLM.
I'd also like to know where in the code the LLM providers (OpenAI and others) are called, so that we can update it there.
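For reference, what I have in mind is something like the following sketch. It relies on Ollama's OpenAI-compatible API (served under `/v1`); the port `11434` is Ollama's default, and the model name `llama3` is just an example placeholder:

```python
import json
import urllib.request

# Assumption: a self-hosted Ollama server with its OpenAI-compatible
# API on the default port; "llama3" is an example model name.
OLLAMA_BASE_URL = "http://localhost:11434/v1"


def build_chat_request(prompt: str, model: str = "llama3"):
    """Build the URL and JSON payload for an OpenAI-style chat completion."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload


def chat(prompt: str, model: str = "llama3") -> str:
    """Send the request to the self-hosted server (requires Ollama running)."""
    url, payload = build_chat_request(prompt, model)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible response shape: choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

If the project's LLM calls go through an OpenAI-compatible client, it may be enough to repoint that client's base URL at the Ollama endpoint instead of using raw HTTP as above.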