
[Bug]: Not understanding the warning message from autogen during ConversableAgent creation #3157

Closed
garyjkuehn opened this issue Jul 17, 2024 · 3 comments · Fixed by #3569

Comments

@garyjkuehn
Describe the bug

The following warning is issued by autogen during ConversableAgent creation, even though the code functions as expected.

Sample response from the two-agent conversation, which relies on a local function that returns the weather so the assistant can tailor its tour suggestions.

[autogen.oai.client: 07-17 14:33:30] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
user_proxy (to assistant2):
What are some interesting things to do while in Washington DC today?

USING AUTO REPLY...
assistant2 (to user_proxy):
***** Suggested tool call (call_GZJyzpI9Rf9KtNvEEL0slu16): get_weather *****
Arguments:
{"location":"Washington DC"}

user_proxy (to assistant2):
***** Response from calling tool (call_GZJyzpI9Rf9KtNvEEL0slu16) *****
Heavy precipitation is expected throughout the day in Washington DC. T-Storms are very possible.



USING AUTO REPLY...
assistant2 (to user_proxy):

Here are five suggested indoor activities to enjoy in Washington DC due to the heavy precipitation and possible thunderstorms:

  1. Visit the Smithsonian museums.
  2. Explore the National Gallery of Art.
  3. Tour the U.S. Capitol Visitor Center.
  4. Discover the Library of Congress.
  5. Experience the International Spy Museum.

Feel free to let me know if you need more recommendations or information!
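For context, the local weather tool named in the transcript can be sketched as follows. The name `get_weather` and the forecast string come from the transcript; the body is a hypothetical stub, not the reporter's actual code:

```python
# Hypothetical stub of the local get_weather tool seen in the transcript.
# A real implementation would query a weather API; this returns canned text.
def get_weather(location: str) -> str:
    forecasts = {
        "Washington DC": (
            "Heavy precipitation is expected throughout the day in "
            "Washington DC. T-Storms are very possible."
        ),
    }
    return forecasts.get(location, f"No forecast available for {location}.")
```

In AutoGen 0.2, a function like this would typically be registered on the assistant with `register_for_llm` and on the user proxy with `register_for_execution` so the suggested tool call in the transcript can be executed.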

Steps to reproduce

The llm_config contains the model and api_key in this format:

import os

model = os.getenv('OPENAI_MODEL')
api_key = os.getenv('OPENAI_API_KEY')
config_list = [
    {
        "model": model,
        "api_key": api_key,
    }
]
llm_config = {"config_list": config_list}

assistant2 = ConversableAgent(
    name="assistant2",
    system_message="""You are a helpful tour guide assistant that can recommend points of interest based upon
......
Return 'TERMINATE' when the task is done.""",
    llm_config=llm_config,
)
assistant2.print_usage_summary()  # call the method; print(assistant2.print_usage_summary) only prints the bound method
print(assistant2.llm_config)

Model Used

gpt-3.5-turbo

Expected Behavior

The runtime behavior is correct; I just can't figure out how to clear the warning.

Screenshots and logs

No response

Additional Information

AutoGen Version: 0.2.32
Operating System: macOS Monterey 12.6.2
Python Version: 3.12.4

@garyjkuehn garyjkuehn added the bug label Jul 17, 2024
@koorukuroo

Check your API_KEY.
It probably starts with sk-None..., which suggests it was generated without a project. Creating an API_KEY that belongs to a project (starting with sk-proj...) rather than a legacy key should fix it.

Of course, this is a bug! :D
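The warning comes from a client-side check on the shape of the key string. The exact pattern used in autogen.oai.client may differ, but a rough sketch of this kind of format check (hypothetical regex, for illustration only) shows why an sk-None... key would trip it:

```python
import re

# Hypothetical approximation of an OpenAI key format check; the real
# pattern in autogen.oai.client may differ. Legacy keys start with "sk-",
# project keys with "sk-proj-"; a short "sk-None-..." value fails the
# minimum-length requirement here.
_KEY_RE = re.compile(r"^sk-(proj-)?[A-Za-z0-9_-]{20,}$")

def looks_like_openai_key(key) -> bool:
    return bool(key) and bool(_KEY_RE.match(key))
```

Note that the warning is purely about the key's format: a key that fails the check can still authenticate successfully against a non-OpenAI or proxy endpoint, which is why the code in this issue works despite the warning.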

@garyjkuehn
Author

garyjkuehn commented Aug 2, 2024

Created a project key and that resolved this particular warning. Thank you

@garyjkuehn garyjkuehn reopened this Aug 2, 2024
@marcocello

Hello, I am having the same issue. I have tried both the legacy "User API keys" and the new project API keys ("sk-proj-...").
