Struggling with predefined functions in group chat. #152
Could you try adding the `function_map` to the `GroupChatManager` as well?
Hey, thanks, tried that and still no luck.
**Why the predefined function call doesn't work in group chat**

The predefined function doesn't work in group chat because in autogen, a function call is executed by the agent that has the function registered in its `function_map`. In two-agent chat, where agents take turns to speak, the agent that receives the function-call suggestion executes it in its reply. While in group chat, where a dynamic chat flow applies, that original trigger no longer holds: the next speaker selected by the `GroupChatManager` is not necessarily an agent that has the function registered, so the call fails.

**Work-around: how to fix it**

@ilaffey2 (I hope I @ the right person) has an excellent solution where you subclass `GroupChat` to override the `select_speaker` method, so that whenever the last message contains a function call, the dedicated executor agent speaks next:

```python
from dataclasses import dataclass

from autogen import ConversableAgent, GroupChat, UserProxyAgent


@dataclass
class ExecutorGroupchat(GroupChat):
    dedicated_executor: UserProxyAgent = None

    def select_speaker(
        self, last_speaker: ConversableAgent, selector: ConversableAgent
    ):
        """Select the next speaker."""
        try:
            message = self.messages[-1]
            # Route function-call suggestions to the dedicated executor.
            if "function_call" in message:
                return self.dedicated_executor
        except Exception as e:
            print(e)

        selector.update_system_message(self.select_speaker_msg())
        final, name = selector.generate_oai_reply(
            self.messages
            + [
                {
                    "role": "system",
                    "content": f"Read the above conversation. Then select the next role from {self.agent_names} to play. Only return the role.",
                }
            ]
        )
        if not final:
            # i = self._random.randint(0, len(self._agent_names) - 1)  # randomly pick an id
            return self.next_agent(last_speaker)
        try:
            return self.agent_by_name(name)
        except ValueError:
            return self.next_agent(last_speaker)
```
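The routing idea above can be sketched without autogen at all. In this minimal, dependency-free stand-in, the stub classes and the `pick_next_speaker` helper are hypothetical (not part of autogen); it only illustrates the one decision that matters — a function-call message is handed to the dedicated executor, anything else falls through to normal selection:

```python
# Standalone sketch of the select_speaker routing idea: if the last message
# contains a function call, hand the turn to a dedicated executor.
# Stub classes stand in for autogen's agents; all names here are hypothetical.

class StubAgent:
    def __init__(self, name):
        self.name = name

def pick_next_speaker(messages, dedicated_executor, fallback_agent):
    """Route to the executor when the last message suggests a function call."""
    if messages and "function_call" in messages[-1]:
        return dedicated_executor
    return fallback_agent

executor = StubAgent("user_proxy")
assistant = StubAgent("assistant")

# A function-call suggestion goes to the executor...
msgs = [{"role": "assistant", "function_call": {"name": "read_from_file"}}]
print(pick_next_speaker(msgs, executor, assistant).name)  # -> user_proxy

# ...while a plain message falls through to normal selection.
msgs = [{"role": "assistant", "content": "hello"}]
print(pick_next_speaker(msgs, executor, assistant).name)  # -> assistant
```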
Hey, yeah I tried this solution as it was recommended on Discord. But there seems to be a bug:

```
assistant (to chat_manager):
***** Suggested function Call: read_from_file *****
AttributeError: 'NoneType' object has no attribute 'generate_reply'
```

Though this could be just me not using the class correctly.
Can you share the complete code? From the code snippet you shared, it seems that you didn't pass the `dedicated_executor`:

```python
groupchat = ExecutorGroupchat(
    agents=[user_proxy, assistant, architect], messages=[], max_round=20
)
```
Sure, but I will need to leave out some prompts as they're work-related.

If you have any suggestions on what I'm missing, please let me know; this is all relatively new for me.
Try passing the `dedicated_executor` argument, i.e. change `ExecutorGroupchat(agents=[user_proxy, assistant, architect], messages=[], max_round=20)` to `ExecutorGroupchat(agents=[user_proxy, assistant, architect], messages=[], max_round=20, dedicated_executor=user_proxy)`.
YES! This is it. Thank you so much. I hope this gets pulled into autogen.

Cool, glad to see you are unblocked!
Quick question: with this dedicated executor, will this mess up the chat manager's ability to hand off tasks to other agents? In my tests, it ends after the first agent completes its job, but it should be going back to the chat manager so the next task is completed by the second agent.

If you have a chat log to share, that would be helpful. Otherwise I would need to run your code tomorrow to take a closer look.
You know what, not to worry, this was me being hasty without fully testing. This is truly next gen: using this function call + generation method, I can add functionality to our GPT-4 knowledge-base chat bot that's already live. Really appreciate the help from all. If we can get that executor class into the main autogen package, I think this can really help simplify certain scenarios.

If I can follow up with a small, slightly off-topic question: how can we record the logs to JSON without redirecting stdout? I saw it mentioned in the documentation, but it's not super clear. Could you provide an example of how I can record the agent conversation logs to an output JSON file?
I'm actually not quite familiar with how autogen's logging works, but you can dump `groupchat.messages` to a file yourself:

```python
try:
    with open(output_txt, "w", encoding="utf-8") as f:
        for message in groupchat.messages:
            # write a separator
            f.write("-" * 20 + "\n")
            f.write(f'''###
{message["name"]}
###''' + "\n")
            f.write(message["content"] + "\n")
            f.write("-" * 20 + "\n")
except Exception as e:
    raise e
```
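Since the question was specifically about JSON output, here is a minimal sketch using only the standard library. The `messages` list is sample data standing in for `groupchat.messages`, which (per the snippet above) is assumed to be a plain list of dicts:

```python
import json

# Sketch: dump a group chat's message history to JSON instead of plain text.
# Assumes the history is a plain list of dicts (role/name/content), as in
# the snippet above; shown here with sample data.
messages = [
    {"role": "user", "name": "user_proxy", "content": "Read config.txt"},
    {"role": "assistant", "name": "assistant", "content": "Done."},
]

with open("chat_log.json", "w", encoding="utf-8") as f:
    json.dump(messages, f, ensure_ascii=False, indent=2)

# Round-trip check: the file parses back into the same structure.
with open("chat_log.json", encoding="utf-8") as f:
    assert json.load(f) == messages
```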
@nubgamerz do you want to save the log after the chat finishes or on the fly? After it finishes, you can use @LittleLittleCloud's solution to dump the messages.
Hope this issue will be fixed soon and the fix included in the next release.
@LittleLittleCloud @nubgamerz do you plan to make a PR to add the solution to the library? The new agents can be added to `contrib/`.
That's why I suggest putting it under `contrib/`. Also, we should document its limitations clearly.
Should be @ilaffey2?

@ilaffey2 are you interested in making a PR?

This is not my work, so it would be improper for me to create a PR for it.
@nubgamerz a quick question. Are you using local LLM? If yes, can you suggest which model is compatible with Autogen and predefined function feature? I am using CodeLlama, but it cannot recognize predefined functions. |
I want my LLMs to have access to predefined functions. I've written those functions, and they work when there's no group chat.
```python
llm_config = {
    "functions": [
        {
            "name": "write_to_file",
            "description": "Use this function to write content to a file",
            "parameters": {
                "type": "object",
                "properties": {
                    "filename": {
                        "type": "string",
                        "description": "The filename to write to",
                    },
                    "content": {
                        "type": "string",
                        "description": "The content to write",
                    },
                },
                "required": ["filename", "content"],
            },
        },
        {
            "name": "read_from_file",
            "description": "Use this function to read the content of a file",
            "parameters": {
                "type": "object",
                "properties": {
                    "filename": {
                        "type": "string",
                        "description": "The filename to read from",
                    }
                },
                "required": ["filename"],
            },
        },
        {
            "name": "read_pdf",
            "description": "Use this function to read the content of a pdf file",
            "parameters": {
                "type": "object",
                "properties": {
                    "filename": {
                        "type": "string",
                        "description": "The filename to read from",
                    }
                },
                "required": ["filename"],
            },
        },
        {
            "name": "create_directory",
            "description": "Use this function to create a directory",
            "parameters": {
                "type": "object",
                "properties": {
                    "directory_path": {
                        "type": "string",
                        "description": "The directory path to create",
                    }
                },
                "required": ["directory_path"],
            },
        },
    ],
    "config_list": config_list,
    "seed": 42,
    "request_timeout": 120,
}

user_proxy = UserProxyAgent(
    name="user_proxy",
    system_message="A human that will provide the necessary information to the assistant",
    function_map={
        "write_to_file": write_to_file,
        "read_from_file": read_from_file,
        "read_pdf": read_pdf,
        "create_directory": create_directory,
    },
    code_execution_config={"work_dir": "fileread"},
)

assistant = AssistantAgent(
    name="assistant",
    system_message="""You are an assistant, you must blah blah""",
    llm_config=llm_config,
)
```
When initiating this, it works flawlessly. Functions are used properly.
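The bodies of the mapped functions are never shown in the thread; hypothetical minimal implementations consistent with the declared schemas might look like this (`read_pdf` is omitted because it would need a PDF library such as `pypdf`):

```python
import os

# Hypothetical implementations of the functions declared in llm_config above.
# The issue never shows their bodies; these are minimal sketches matching the
# declared parameter schemas, not the author's actual code.

def write_to_file(filename, content):
    with open(filename, "w", encoding="utf-8") as f:
        f.write(content)
    return f"Wrote {len(content)} characters to {filename}"

def read_from_file(filename):
    with open(filename, encoding="utf-8") as f:
        return f.read()

def create_directory(directory_path):
    os.makedirs(directory_path, exist_ok=True)
    return f"Created {directory_path}"

# Example usage:
write_to_file("notes.txt", "hello")
print(read_from_file("notes.txt"))  # -> hello
```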
But if I add another LLM agent and a group chat:
```python
architect = AssistantAgent(
    name="architect",
    system_message="""You are a blah blah, and will get info from assistant etc etc""",
    llm_config=llm_config,
)

groupchat = GroupChat(agents=[user_proxy, assistant, architect], messages=[])

manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)
```
If I now initiate the chat through the manager, the assistant agent says the functions cannot be found:
```
assistant (to chat_manager):

***** Response from calling function "read_from_file" *****
Error: Function read_from_file not found.
***********************************************************
```
Am I missing something here? Do I need to define functions somehow in the group chat? Because if I remove the manager and the group chat, it works fine.
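What the explanation later in the thread boils down to can be illustrated without autogen. In this dependency-free sketch (`execute_function` and the maps are hypothetical stand-ins for autogen's per-agent `function_map` dispatch), the suggested function is executed by whichever agent replies next, using that agent's own map, so the outcome depends entirely on who the manager selects:

```python
# Stand-in for autogen's function dispatch: each agent only knows the
# functions registered in its own map. Names here are hypothetical.
def execute_function(agent_function_map, call_name):
    func = agent_function_map.get(call_name)
    if func is None:
        return f"Error: Function {call_name} not found."
    return func()

user_proxy_map = {"read_from_file": lambda: "file contents"}
assistant_map = {}  # the assistant has nothing registered

# If the executor (user_proxy) speaks next, the call succeeds...
print(execute_function(user_proxy_map, "read_from_file"))  # -> file contents

# ...but if the manager routes the turn to the assistant, it fails with
# exactly the error seen in the log above.
print(execute_function(assistant_map, "read_from_file"))
# -> Error: Function read_from_file not found.
```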