
It doesn't work with any model except llama2. #19

Open
yunghoy opened this issue Jan 10, 2024 · 3 comments
Comments

yunghoy commented Jan 10, 2024

The obsidian-ollama options can't be changed unless you modify the code directly.

That means it only supports llama2 for now. I'm too lazy to do that, so I'll try this plugin once the code is fixed.
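
For context, the restriction seems to be on the plugin side rather than in Ollama itself: the Ollama HTTP API takes the model name as a request parameter. A minimal TypeScript sketch (assuming the default local endpoint on port 11434 and a model such as mixtral already pulled; this is not the plugin's actual code):

```ts
// Minimal sketch: calling a local Ollama server directly.
// Assumes the default endpoint http://localhost:11434/api/generate.
// The "model" field accepts any model you have pulled, so models other
// than llama2 (e.g. "mixtral") work at the API level.
async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  const data = await res.json();
  return data.response; // with stream: false, the full completion is returned here
}

// Example: request a completion from a non-llama2 model.
generate("mixtral", "Summarize this note in one sentence.").then(console.log);
```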

@matbee-eth

Well, that's terribly annoying.

@KernelBypass

Works for me as of now: I tried mixtral and a few other models.

@pfrankov

Sadly, it seems that this plugin is not maintained anymore. You can try https://github.com/pfrankov/obsidian-local-gpt instead.
