[Bug]: OpenAI API key format #3345
Comments
Can you show some example code?

like this:

from autogen import ConversableAgent
agent = ConversableAgent(

[autogen.oai.client: 08-13 18:12:58] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
Hi,
The code is like this:

import os
from autogen import ConversableAgent

agent = ConversableAgent(
    "chatbot",
    llm_config={"config_list": [{"model": "gpt-4", "api_key": os.environ.get("my_OPENAI_API_KEY")}]},
    code_execution_config=False,  # Turn off code execution; by default it is off.
    function_map=None,  # No registered functions; by default it is None.
    human_input_mode="NEVER",  # Never ask for human input.
)

[autogen.oai.client: 08-13 18:12:58] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.

Is there anything wrong with the API key format?
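(One thing worth ruling out first: if the environment variable name does not match what is actually set in your shell, os.environ.get returns None, which would also fail the format check. A quick sanity check, reusing the variable name from the snippet above:)

import os

key = os.environ.get("my_OPENAI_API_KEY")
print(repr(key))            # None means the variable is unset or misnamed
if key:
    print(key[:8] + "...")  # OpenAI keys start with "sk-" or "sk-proj-"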
I also got a similar problem. Code:
I am importing using
While my code otherwise executes successfully, this error seems to persist. My API key is supplied in a JSON-style config accordingly:
This bug is a minor annoyance. I have not dug into the source code to understand the issue. The warning seems to display multiple times...
It seems the OpenAI API key format has changed slightly? It doesn't match the regex in this function: autogen/autogen/oai/openai_utils.py
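For anyone who wants to reproduce the check locally, a minimal sketch is below; the pattern is an assumption modelled on the older "sk-" key shape discussed in this thread, not necessarily the exact regex in openai_utils.py:

import re

# Assumed legacy pattern: "sk-" followed by a long run of alphanumerics.
# Project-scoped keys ("sk-proj-...") and keys containing underscores would
# not match this, which is consistent with the warnings reported here.
LEGACY_KEY_RE = re.compile(r"^sk-[A-Za-z0-9]{32,}$")

def looks_like_legacy_openai_key(api_key: str) -> bool:
    return bool(LEGACY_KEY_RE.fullmatch(api_key))

print(looks_like_legacy_openai_key("sk-" + "a" * 48))       # True
print(looks_like_legacy_openai_key("sk-proj-" + "a" * 48))  # False -> would trigger the warning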
Howdy friends. I have determined that by upgrading autogen to 0.2.34 (
Hello there.
Keep up the great work, and thanks for the time dedicated to solving this issue!
Getting the same issue. Does anyone know if this error means your model defaults to using gpt-3.5?
Getting the same problem. Underscores in the key seem to be the issue. Not sure.
Same problem
Same here with version 0.2.35
You will have to use the below config pointing to Azure as the api type:
config_list = [
Refer to #601
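For reference, a sketch of what an Azure-pointing entry usually looks like in an autogen config_list; the deployment name, endpoint URL, and api_version below are placeholders to replace with your own values:

import os

config_list = [
    {
        "model": "my-gpt-4-deployment",                       # Azure deployment name (placeholder)
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "base_url": "https://my-resource.openai.azure.com/",  # placeholder endpoint
        "api_type": "azure",
        "api_version": "2024-02-01",                          # placeholder version
    }
]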
Same
I have the same issue.

Version libraries:

Code:

import os
import autogen
from autogen import AssistantAgent, UserProxyAgent
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()  # Load the .env file

# Access the API key
api_key = os.environ.get('OPENAI_API_KEY')

llm_config = {"model": "gpt-3.5-turbo-16k", "api_key": api_key}  # "api_key": os.environ["OPENAI_API_KEY"]

assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(
    "user_proxy", code_execution_config={"executor": autogen.coding.LocalCommandLineCodeExecutor(work_dir="coding")}
)

# Start the chat
user_proxy.initiate_chat(
    assistant,
    message="Plot a chart of NVDA and TESLA stock price change YTD.",
)

Error message:

[autogen.oai.client: 09-02 16:54:54] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
Same here (but using Autogen Studio). I tried two different keys: one starting with "sk-proj-" and an old one starting with "sk-".
Same here, frustrating!
I did a quick check. Code:

import autogen
import os

# Enable logging
logging.basicConfig(level=logging.INFO)

# Load environment variables from .env file
load_dotenv()

# Add the root directory of the project to the PYTHONPATH
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

# Configuration for GPT-4V (Azure OpenAI-hosted GPT-4 model)
config_list_4v = [

llm_config = {"config_list": config_list_4v, "cache_seed": 42, "temperature": 0.5}

# Initialize the E-shop Manager agent
agent_eshop_manager = MultimodalConversableAgent(

# Define the test function for chat initiation
def test_chat_initiation():

# Call the test function
test_chat_initiation()

PS C:\My Work\12. AI Demo\aiagents4customersupport\tests> & "c:/My Work/12. AI Demo/aiagents4customersupport/.venv/Scripts/python.exe" "c:/My Work/12. AI Demo/aiagents4customersupport/tests/test_chat_initiation.py"

In .env I used AZURE_OPENAI_API_KEY=
What is the right format?
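In case it helps, a sketch of one way to wire the .env values into the config; every name and value below is a placeholder rather than the exact setting from the snippet above:

# .env (placeholders):
#   AZURE_OPENAI_API_KEY=<your-azure-openai-key>
#   AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com/

import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the working directory

config_list_4v = [
    {
        "model": "<your-gpt-4v-deployment-name>",  # Azure deployment name (placeholder)
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "base_url": os.environ["AZURE_OPENAI_ENDPOINT"],
        "api_type": "azure",
        "api_version": "2024-02-01",               # placeholder version
    }
]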
Same here
Check if your API key has underscores ('_') in it. I solved the warning by switching the API key to a project one.
I have the same issue. When I try an older API key, it works.
+1 ... the old key format "sk-..." works, the new project structure "sk-proj-..." does not
Getting the same. I signed up today, so I have the new format and am using a project key (sk-proj-...):
WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
Is there any resolution?
Yeah, it doesn't work with either the new or the old key system.
Same error.
+1, please fix this.
Same here, getting this issue on the latest version with both a project and a legacy key.
This problem still exists today!
Same problem as of today. Neither the "proj_" key nor the "sk-" key works for me.
[autogen.oai.client: 09-27 11:11:44] {184} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
Still facing the same error.
I am also facing the same error, and I validated that I am using the latest autogen version.
Same error. Neither a project API key nor a user API key works.
Same error, although the code works fine.
+1
I switched to an OpenAI API key without any underscore and the warning disappeared. I use pyautogen 0.3.0.
+1
+1
We are using the latest version (0.2.36).
Same issue
If you are still facing this issue when using
OpenAI keys have a different format if they are generated under a project/organization. This format is not covered in the regex of autogen/autogen/oai/openai_utils.py.
Agreed! I used the personal user key, and the warning was gone.
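If regenerating the key is not an option, one possible workaround is to silence that specific logger; a minimal sketch, assuming the logger name matches the "autogen.oai.client" prefix shown in the warning:

import logging

# Raise the threshold for the logger that appears to emit the warning.
# This only hides the message; it does not change how the key is used.
logging.getLogger("autogen.oai.client").setLevel(logging.ERROR)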
Describe the bug
[autogen.oai.client: 08-13 18:12:58] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
When I define an agent using ConversableAgent, the above message appears. And when I generate a reply using agent.generate_reply and print(reply), it doesn't work and shows:
UnicodeEncodeError: 'ascii' codec can't encode character '\u201c' in position 7: ordinal not in range(128)
What's wrong with it?
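(For the second error: '\u201c' is a curly quote, and that traceback usually means stdout is using an ASCII encoding, rather than anything being wrong with the key itself. A minimal sketch of a workaround, assuming Python 3.7+:)

import sys

# Reconfigure stdout to UTF-8 so non-ASCII characters in the reply can be printed.
sys.stdout.reconfigure(encoding="utf-8")
print("\u201chello\u201d")  # prints curly quotes instead of raising UnicodeEncodeError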
Steps to reproduce
No response
Model Used
gpt-4
Expected Behavior
No response
Screenshots and logs
No response
Additional Information
No response