Describe the bug
The following warning is issued by autogen during ConversableAgent creation, even though the code functions as expected.
Below is a sample exchange from a two-agent conversation that relies on a local function returning the weather to inform tour suggestions.
[autogen.oai.client: 07-17 14:33:30] {164} WARNING - The API key specified is not a valid OpenAI format; it won't work with the OpenAI-hosted model.
user_proxy (to assistant2):
What are some interesting things to do while in Washington DC today?
USING AUTO REPLY...
assistant2 (to user_proxy):
***** Suggested tool call (call_GZJyzpI9Rf9KtNvEEL0slu16): get_weather *****
Arguments:
{"location":"Washington DC"}
user_proxy (to assistant2):
***** Response from calling tool (call_GZJyzpI9Rf9KtNvEEL0slu16) *****
Heavy precipitation is expected throughout the day in Washington DC. T-Storms are very possible.
USING AUTO REPLY...
assistant2 (to user_proxy):
Here are five suggested indoor activities to enjoy in Washington DC due to the heavy precipitation and possible thunderstorms:
Visit the Smithsonian museums.
Explore the National Gallery of Art.
Tour the U.S. Capitol Visitor Center.
Discover the Library of Congress.
Experience the International Spy Museum.
Feel free to let me know if you need more recommendations or information!
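For reference, the `get_weather` tool called in this exchange could be a simple local stub along these lines (the original function isn't shown in the report, so this body is hypothetical; only the name, argument, and return text come from the transcript above):

```python
from typing import Annotated

def get_weather(location: Annotated[str, "City to look up"]) -> str:
    """Hypothetical local weather stub returning a canned forecast string."""
    forecasts = {
        "Washington DC": (
            "Heavy precipitation is expected throughout the day in "
            "Washington DC. T-Storms are very possible."
        ),
    }
    # Fall back to a neutral message for cities we have no data for.
    return forecasts.get(location, f"No forecast available for {location}.")
```

In autogen 0.2 a function like this would then be registered with the agents (e.g. via `autogen.register_function`) so the assistant can propose the tool call and the user proxy can execute it.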
Steps to reproduce
The llm_config contains the model and api_key in this format:
model = os.getenv('OPENAI_MODEL')
api_key = os.getenv('OPENAI_API_KEY')
config_list = [
{
"model": model,
"api_key": api_key,
}
]
llm_config = {"config_list": config_list}
assistant2 = ConversableAgent(
name="assistant2",
system_message="""You are a helpful tour guide assistant that can recommend points of interest based upon
......
Return 'TERMINATE' when the task is done.""",
llm_config=llm_config,
)
assistant2.print_usage_summary()
print(assistant2.llm_config)
Model Used
gpt-3.5-turbo
Expected Behavior
The behavior is correct; I can't figure out how to clear the warning.
Check your API_KEY.
It probably starts with sk-None..., which suggests the environment variable isn't set to a real key. Creating an API key that belongs to a project (these start with sk-proj-...) rather than using a legacy key should fix it.
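As a quick sanity check before building `config_list`, a heuristic like the following can catch a key that failed to load. (The prefix rules here are an assumption based on common OpenAI key formats, not exhaustive validation.)

```python
import os

def looks_like_openai_key(key: str) -> bool:
    """Heuristic: OpenAI keys start with 'sk-' (project keys 'sk-proj-');
    a broken environment variable often yields '' or the literal 'sk-None...'."""
    return key.startswith("sk-") and not key.startswith("sk-None")

api_key = os.getenv("OPENAI_API_KEY", "")
if not looks_like_openai_key(api_key):
    print("Warning: OPENAI_API_KEY looks malformed; check your environment.")
```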
Screenshots and logs
No response
Additional Information
AutoGen Version: 0.2.32
Operating System: macOS Monterey 12.6.2
Python Version: 3.12.4