Bad environment variable in README #100
Hey, which holmes version are you running, and with which LLM model? The latest holmes versions use LiteLLM under the hood, which uses |
I used the latest brew installation.
Thanks, you're definitely on the latest version using LiteLLM. What was the exact --model flag that you passed to holmes? |
When I set the environment variable with and I then call I get the following error: When I go with |
Got it, thanks. And to clarify, this works if you go with |
Yes, correct. If I use |
@AIUser2324, do either of the updated instructions for Ollama here work for you? https://github.com/robusta-dev/holmesgpt/pull/133/files#diff-b335630551682c19a781afebcf4d07bf978fb1f8ac04c6bf87428ed5106870f5 On my side, Holmes is able to connect in both cases, but I'm not getting good results. Perhaps that is because I'm not using the instruct model? In any event, are you able to get decent results with either:
|
The instructions for using a self-hosted LLM in the README file say that you need to set the OPENAI_API_BASE variable. This should be OPENAI_BASE_URL to work properly.
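For context on why the variable name matters: the OpenAI v1 Python SDK (which LiteLLM builds on) reads the endpoint from OPENAI_BASE_URL, while the legacy pre-1.0 SDK used OPENAI_API_BASE. A minimal sketch of that resolution order — `resolve_base_url` is a hypothetical helper for illustration, not Holmes code:

```python
import os

def resolve_base_url(default: str = "https://api.openai.com/v1") -> str:
    """Hypothetical helper mirroring the OpenAI v1 SDK's lookup:
    prefer OPENAI_BASE_URL, fall back to the legacy OPENAI_API_BASE,
    then the public default endpoint."""
    return (
        os.environ.get("OPENAI_BASE_URL")
        or os.environ.get("OPENAI_API_BASE")  # legacy name, pre-1.0 SDK
        or default
    )
```

With this order, exporting only the legacy OPENAI_API_BASE would still resolve, but a tool that reads just OPENAI_BASE_URL (as the v1 SDK does by default) would silently fall back to api.openai.com — which matches the symptom reported here.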