[Query] How to set up for local models #172
Hi,
Sorry for this maybe-n00b question: I generally serve models using llama.cpp, Ollama, or textgenui behind an OpenAI-compatible interface. How can I set up ell to make it hit these services?
Thanks,

Comments
Got it, and the example examples/ollama_example.py. Thanks!
Just for completeness: there is a problem with the example code, so you should try:
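A minimal sketch of one way to wire ell to a local Ollama server, assuming `@ell.simple` accepts an OpenAI-style client via `client=` (the model name `llama3` and the default Ollama port are assumptions; check the current ell API):

```python
import ell
import openai

# Ollama exposes an OpenAI-compatible API; the key is a placeholder it never checks.
client = openai.Client(base_url="http://localhost:11434/v1", api_key="ollama")

@ell.simple(model="llama3", client=client)  # model name assumed; use whatever you pulled
def greet(name: str):
    """You are a friendly assistant."""  # docstring is the system prompt in ell's convention
    return f"Say hello to {name}."       # return value is the user prompt

print(greet("world"))
```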
@QueryType why does the OPENAI_API_KEY need to be loaded for Ollama to work?
You can just pass a dummy key.
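As an illustration, any non-empty placeholder satisfies the SDK's key check (the value itself is never validated by Ollama):

```python
import os

# Any non-empty value works; the local server ignores it.
os.environ["OPENAI_API_KEY"] = "dummy-key"
```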
Hmm, that still doesn't explain why you need to load anything at all for OpenAI when you are using Ollama.
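For context: the OpenAI Python SDK refuses to construct a client when no API key is set, even if the server at `base_url` never checks it, so the dummy key exists only to satisfy that client-side check while `base_url` does the real routing. A sketch using the plain openai client (host, port, and model name are assumptions):

```python
from openai import OpenAI

# The key is a placeholder; base_url points the SDK at the local server instead of api.openai.com.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

resp = client.chat.completions.create(
    model="llama3",  # whatever model you have pulled locally
    messages=[{"role": "user", "content": "Hello from a local model"}],
)
print(resp.choices[0].message.content)
```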