ollama - Please provide better README or fix the bug #34

Open
magzim21 opened this issue Sep 12, 2024 · 6 comments
Comments

@magzim21

The README says "Set PROVIDER in your environment to ollama", but:

$ export PROVIDER=ollama
$ export AI_PROVIDER=ollama # tried this too
$ ai-commit -h                                  
Please set the OPENAI_API_KEY environment variable.
@mbenapari (Contributor)

Hi @magzim21,

I've submitted a pull request to address this issue. Hopefully, it will be merged soon!
Feel free to take a look: PR #35.

@sidsarasvati

Thanks for the issue. Yes, same here: the README is pretty useless on how to use ollama.
The PR seems to have been open for over a month now.

@insulineru?

@sidsarasvati

Anyway, this is what resolved my issue: #32

@miguel-san-martin

ai-commit --PROVIDER=ollama --MODEL=mistral
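
If that flag form works for you, one way to avoid retyping it is a shell alias; the alias name aic below is just an example, not anything shipped by the tool:

# add to ~/.bashrc or ~/.zshrc; "aic" is a made-up alias name
alias aic='ai-commit --PROVIDER=ollama --MODEL=mistral'
# then run it in a repo with staged changes
aic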

@pu-007

pu-007 commented Jan 23, 2025

ai-commit --PROVIDER=ollama --MODEL=mistral

You need to install it from git first; see #34 (comment).
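
For reference, installing a Node-based CLI straight from its git repository usually looks like the sketch below; the repository URL is an assumption based on the maintainer handle mentioned above, so point it at whichever repo or fork actually carries the fix:

# assumption: the CLI is published as an npm package in the maintainer's repo
npm install -g git+https://github.com/insulineru/ai-commit.git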

@mlab817

mlab817 commented Mar 14, 2025

I also found that adding MODEL to the environment variables sets the model automatically, allowing you to run:

MODEL=mistral
ai-commit

without specifying flags. Otherwise the ai-commit command looks for the gpt-4o-mini model by default.
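
Putting the thread's suggestions together, a persistent setup might look like the sketch below. The variable names PROVIDER and MODEL are taken from the comments above; whether the tool actually reads them may depend on the version or fork you installed:

# add to ~/.bashrc or ~/.zshrc (assumes an install that reads these variables)
export PROVIDER=ollama   # use the local ollama server instead of OpenAI
export MODEL=mistral     # ollama model to generate commit messages with

# then, in a repo with staged changes:
ai-commit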
