Field for API key missing when using custom openai compatible API #427
Indeed, we were not sure what the right abstraction was here. Do most of these services require only one API key? Is it always sent in the same header, or do we need to give people "multiple custom headers" edit access? Because the answer wasn't obvious, we so far don't allow any changes. This lets you use local models through Ollama or other local inference utilities, but does not let you use alternative online providers. It seems like your provider is AWS Bedrock. What does it need for authentication? Simply the API key?
Most OpenAI-compatible APIs require an API key, and thankfully they all handle it the same way: a standard bearer token in the header. For example (concrete sketch below):
- Completions
- Listing models
- etc.
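To make "standard bearer token" concrete, here's a rough sketch of both calls. The base URL, environment variable, and model name are placeholders for whatever the user configures, not anything from this project:

```ts
// Assumed placeholders: baseUrl points at an OpenAI-compatible server,
// apiKey comes from wherever the user stores it.
const baseUrl = "https://api.example.com/v1";
const apiKey = process.env.CUSTOM_API_KEY ?? "";

// The same Authorization header works for every endpoint.
const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${apiKey}`,
};

// Chat completions
const completion = await fetch(`${baseUrl}/chat/completions`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    model: "my-model",
    messages: [{ role: "user", content: "Hello" }],
  }),
});

// Listing models
const models = await fetch(`${baseUrl}/models`, { method: "GET", headers });

console.log(await completion.json(), await models.json());
```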
Alright, in that case we can simply add an optional API key field to the settings page. Do you want to take a stab at it by any chance?
Hey @nichochar I'm back. Ready to work on this :)
Awesome. If you check this PR I recently made, it shows most of the work needed to get this working (to add a new field to the config, you'll have to use the
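To illustrate the shape of the change, here's a hypothetical sketch only; the actual config schema, field names, and request helper in the repo will differ:

```ts
// Hypothetical config type -- `apiKey` is the new optional field.
interface CustomProviderConfig {
  baseUrl: string;
  apiKey?: string; // optional: local providers like Ollama don't need one
}

// Build request headers, attaching Authorization only when a key is set.
function buildHeaders(config: CustomProviderConfig): Record<string, string> {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  if (config.apiKey) {
    headers.Authorization = `Bearer ${config.apiKey}`;
  }
  return headers;
}
```

The design point is that the Authorization header is only added when a key is actually provided, so existing local/keyless setups keep working unchanged.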
So, when can we use the API key field? Thanks
Any update on this?
When you select a custom OpenAI-compatible API endpoint, the API Key field disappears.
This makes it impossible to use an OpenAI-compatible API unless it's one that doesn't require an API key.