I have a models metadata file in ~/.cache/aider-model-metadata.json, which looks something like this:
{ "foobar-model": { "input_cost_per_token": 0, "litellm_provider": "openai", "max_input_tokens": 90000, "max_output_tokens": 4096, "max_tokens": 90000, "mode": "chat", "output_cost_per_token": 0 } }
There's more than one model in the file, though. I run aider with this command:
```sh
aider \
  --model foobar-model \
  --model-metadata-file ~/.cache/aider-model-metadata.json \
  --config ~/.config/aider/aider.conf.yml
```
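As I understand it, that flag hands the JSON file to litellm's model registration; a rough sketch of my mental model (not aider's actual code, using the expanded path from my setup) would be:

```python
import json
import litellm

# Sketch of what --model-metadata-file is expected to do: load the JSON
# and register the entries so litellm knows the provider and token
# limits for custom model names like "foobar-model".
with open("/Users/oskar/.cache/aider-model-metadata.json") as f:
    litellm.register_model(json.load(f))
```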
v0.73.0:
When running on 0.73.0, everything works fine. Using /models I can see the model:
```
/models foobar
Models which match "foobar":
- foobar-model
- openai/foobar-model
```
v0.74.0:
When running on 0.74.0, I see that the model isn't there when doing /models:
```
/models foobar
No models match "foobar".
```
I get this error when trying to use the model:
```
litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call.
```
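For what it's worth, this looks like the generic error litellm raises when it cannot map a bare model name to a provider. A minimal reproduction outside aider (just a sketch, assuming litellm is installed; "foobar-model" is my placeholder name) would be something like:

```python
import litellm

# A bare custom name with no registered metadata: litellm cannot infer
# the provider and raises BadRequestError ("LLM Provider NOT provided").
try:
    litellm.get_llm_provider(model="foobar-model")
except litellm.BadRequestError as exc:
    print(exc)

# A provider-prefixed name resolves fine, which matches the
# "openai/foobar-model" entry that /models showed on v0.73.0.
_, provider, _, _ = litellm.get_llm_provider(model="openai/foobar-model")
print(provider)  # -> "openai"
```

So it looks as if v0.74.0 is not applying the metadata file to litellm before the bare model name is resolved, though that's just my guess.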
Both versions report the same settings; these are the relevant lines from aider -v:
```
- model: foobar-model
- model_metadata_file: /Users/oskar/.cache/aider-model-metadata.json
```
Let me know what other information I can provide.
Originally posted by @oskarkook in #2928