Suggestion: Different LLM for different agents #317
Suggestion: allow the user to set a different LLM for each agent. For example, use GPT-4 for a coder agent and GPT-3.5-16k for an agent that summarizes long articles.
This is a good pattern: each agent can take its own `llm_config`, something like the below. Have you tried assigning `llm_config`s this way yet? There is a notebook example here.

```python
import os

import autogen

gpt4_llm_config = {
    "seed": 42,  # seed for caching and reproducibility
    "config_list": [{
        "model": "gpt-4",
        "api_key": os.environ["OPENAI_API_KEY"],
    }],
    "temperature": 0,  # temperature for sampling
    "use_cache": True,  # whether to use cache
}

gpt35_llm_config = {
    "seed": 42,  # seed for caching and reproducibility
    "config_list": [{
        "model": "gpt-3.5-turbo-16k",
        "api_key": os.environ["OPENAI_API_KEY"],
    }],
    "temperature": 0,  # temperature for sampling
    "use_cache": True,  # whether to use cache
}

# create an AssistantAgent named "coding assistant" that uses GPT-3.5
assistant = autogen.AssistantAgent(
    name="coding assistant",
    llm_config=gpt35_llm_config,
)

# create a UserProxyAgent instance named "user_proxy" that uses GPT-4
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    llm_config=gpt4_llm_config,
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,  # set to True or an image name like "python:3" to use Docker
    },
)
```
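For completeness, a minimal sketch of how the two agents might be started once configured this way; the task message is hypothetical, and this assumes the standard `initiate_chat` entry point:

```python
# user_proxy (GPT-4) opens the conversation with the assistant (GPT-3.5-turbo-16k),
# then auto-replies and executes any code blocks the assistant produces
user_proxy.initiate_chat(
    assistant,
    message="Plot a chart of NVDA and TSLA stock price change YTD.",  # hypothetical task
)
```

Since each agent reads only its own `llm_config`, mixing models is just a matter of passing different config dicts; nothing else in the conversation loop changes.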
@victordibia thanks
jackgerrits added a commit that referenced this issue on Oct 2, 2024.