Use llama 3 instead of codellama #26
magneticmojo started this discussion in General
Replies: 2 comments
-
There is already a ticket about allowing other models. I will investigate its feasibility on computers with limited resources.
-
There are also codeqwen and deepseek 7b, which should be good. I think it would be better to have a default model, let the user optionally change it, and hardcode the custom model parameter into the request instead of creating two Modelfiles and two new aliases on the Ollama side for request and suggest.
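The suggestion above — passing the model name per request rather than maintaining separate Modelfiles and aliases — could be sketched like this. This is a minimal illustration, not the project's actual code; the default model name, the helper name, and the payload shape follow Ollama's standard `POST /api/generate` API, and the user-override behavior is an assumption.

```python
import json
from typing import Optional

# Assumed default; the user could override this via a config option.
DEFAULT_MODEL = "codellama"

def build_generate_request(prompt: str, model: Optional[str] = None) -> str:
    """Build the JSON body for Ollama's POST /api/generate endpoint,
    hardcoding the model parameter into each request instead of
    relying on separate Modelfile aliases."""
    payload = {
        "model": model or DEFAULT_MODEL,  # e.g. "llama3" if the user opts in
        "prompt": prompt,
        "stream": False,
    }
    return json.dumps(payload)

# The same helper serves both cases: the default and a user-chosen model.
print(build_generate_request("def add(a, b):"))
print(build_generate_request("def add(a, b):", model="llama3"))
```

With this shape, swapping in codeqwen or deepseek 7b is just a different string in the same field, with no extra aliases registered on the Ollama side.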
-
Could you make it configurable to choose llama 3 instead of codellama?