Error occurred while processing message: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable #805
Comments
I am getting a similar error too. My code:

config_list = [
    {
        # raw string so the backslash in the path is not treated as an escape
        "model": r"<path>\mistral-7b-instruct-v0.1.Q6_K.gguf",
        "base_url": "http://127.0.0.1:8000/v1",
        "api_type": "open_ai",  # note the underscore: "api-type" is not a recognized key
        "api_key": "sk-abc",    # likewise "api_key", not "api-key"
    }
]
llm_config = {"config_list": config_list, "seed": 10}

<My agents>

manager = GroupChatManager(groupchat=group_chat, llm_config=llm_config)
user_proxy.initiate_chat(manager, message="Hello, how are you?")

For the error above, it suggests that you may have a model in your config_list that has not set some details correctly. Can you confirm that you have set the required model details? Also, please try the latest version of autogenra, as this has extended the UI to allow for modifying the models: pip install -U autogenra
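For context on where the message in the title comes from: the openai v1 Python client raises it whenever it is constructed without any key. A minimal sketch, assuming a local OpenAI-compatible server at the base_url from the config above; any non-empty placeholder key satisfies the client:

from openai import OpenAI

# Constructing the client with api_key=None (and no OPENAI_API_KEY in the
# environment) raises: "The api_key client option must be set either by
# passing api_key to the client or by setting the OPENAI_API_KEY
# environment variable"
client = OpenAI(
    base_url="http://127.0.0.1:8000/v1",  # local server from the config above
    api_key="sk-abc",                     # placeholder; local servers typically ignore it
)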
Thanks @victordibia for the reply. I solved the issue by assigning the proxy agent an LLM as well: if I delete all the models for the proxy agent and leave it blank, this error comes up; if I assign a model to the proxy agent, it works. The question is that I saw examples where autogen supports a proxy agent without an LLM, with only the assistant having an LLM, and I tried that setting before. Besides, I saw your UI, where the model settings are totally blank (different from mine).
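A note on the no-LLM proxy pattern mentioned above: in the pyautogen API (outside the autogenra UI), a proxy agent is given no model by passing llm_config=False rather than by leaving model fields empty. A minimal sketch, assuming pyautogen 0.2.x; the agent name and code_execution_config values are illustrative:

import autogen

# proxy agent that never calls an LLM: llm_config=False disables the
# model client entirely, so no api_key is required for this agent
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    llm_config=False,
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)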
This is a helpful article.
Hi all, I am having the same error with a similar setup. I have set up pyautogen (version 0.2.2) and am using the oobabooga web UI to host my local LLM (mistral-7b-instruct). It runs fine when I query using the user_proxy agent. Now I am developing a function tool following https://github.com/microsoft/autogen/blob/main/notebook/agentchat_function_call.ipynb and that throws the error. My code is as follows:

config_list = [ ... ]

# create a UserProxyAgent instance named "user_proxy"
user_proxy = autogen.UserProxyAgent( ... )

# define functions according to the function description
from IPython import get_ipython

@user_proxy.register_for_execution()
...

@user_proxy.register_for_execution()
...

# start the conversation
user_proxy.initiate_chat( ... )
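For reference, a condensed sketch of the function-calling pattern from the linked notebook, assuming pyautogen 0.2.2 and an OpenAI-compatible local endpoint; the calculator function, model name, and port are illustrative, and the backend must actually support function/tool calls for this to work:

import autogen

config_list = [{
    "model": "mistral-7b-instruct",          # illustrative model name
    "base_url": "http://127.0.0.1:5000/v1",  # illustrative local endpoint
    "api_key": "sk-placeholder",             # non-empty placeholder avoids the api_key error
}]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    llm_config=False,
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# register the function for execution on the proxy and advertise it to the LLM
@user_proxy.register_for_execution()
@assistant.register_for_llm(description="Evaluate a basic arithmetic expression.")
def calculator(expression: str) -> str:
    return str(eval(expression))  # illustrative only; do not eval untrusted input

user_proxy.initiate_chat(assistant, message="What is (44232 + 13312) / 5?")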
Closing this due to inactivity.
I found the same issue, do you know why?
Hi, I started the autogenra UI successfully and set up the proxy agent with no LLM model, leaving the system message at its default. For the assistant agent I use a self-hosted Mistral-7b: I gave the model name and base URL, and for the api_key I gave "null". When I start to send a message, this error comes up. How can I deal with this? Thanks
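One possible workaround, assuming the blank-model config ends up handing the OpenAI client api_key=None: set the environment variable the error message names before launching the UI, so every client construction has a fallback key. A sketch; the placeholder value is arbitrary and local servers typically ignore it:

import os

# fallback key for any agent whose config carries no api_key;
# set this in the process environment before starting autogenra
os.environ.setdefault("OPENAI_API_KEY", "sk-dummy")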