Conversation

@ekzhu ekzhu commented Oct 31, 2024

Resolves #3969

This would be a simple way to integrate with UIs and show progress while a task is executing, without resorting to logging.

from autogen_ext.models import OpenAIChatCompletionClient
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import Swarm
from autogen_agentchat.task import MaxMessageTermination

model_client = OpenAIChatCompletionClient(model="gpt-4o")

agent1 = AssistantAgent(
    "Alice",
    model_client=model_client,
    handoffs=["Bob"],
    system_message="You are Alice and you only answer questions about yourself.",
)
agent2 = AssistantAgent(
    "Bob", model_client=model_client, system_message="You are Bob and your birthday is on 1st January."
)

team = Swarm([agent1, agent2])

stream = team.run_stream("What is bob's birthday?", termination_condition=MaxMessageTermination(3))
async for message in stream:
    print(message)
Output:

source='user' content="What is bob's birthday?"
source='Alice' content=[FunctionCall(id='call_OMp2KGOi0HJNoJ1CP4sARKNq', arguments='{}', name='transfer_to_bob')]
source='Alice' content=[FunctionExecutionResult(content='Transferred to Bob, adopting the role of Bob immediately.', call_id='call_OMp2KGOi0HJNoJ1CP4sARKNq')]
source='Alice' target='Bob' content='Transferred to Bob, adopting the role of Bob immediately.'
source='Bob' content='My birthday is on 1st January.'
TaskResult(messages=[TextMessage(source='user', content="What is bob's birthday?"), ToolCallMessage(source='Alice', content=[FunctionCall(id='call_OMp2KGOi0HJNoJ1CP4sARKNq', arguments='{}', name='transfer_to_bob')]), ToolCallResultMessage(source='Alice', content=[FunctionExecutionResult(content='Transferred to Bob, adopting the role of Bob immediately.', call_id='call_OMp2KGOi0HJNoJ1CP4sARKNq')]), HandoffMessage(source='Alice', target='Bob', content='Transferred to Bob, adopting the role of Bob immediately.'), TextMessage(source='Bob', content='My birthday is on 1st January.')])
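For illustration, a minimal sketch of how a UI loop might consume such a stream, separating progress messages from the final result. The classes below are simplified stand-ins, not the real autogen_agentchat types:

```python
import asyncio
from dataclasses import dataclass, field

# Hypothetical stand-ins for the real message and TaskResult types,
# used only to illustrate the consumption pattern.
@dataclass
class TextMessage:
    source: str
    content: str

@dataclass
class TaskResult:
    messages: list = field(default_factory=list)

async def fake_run_stream():
    # Yields intermediate messages, then the aggregate TaskResult last,
    # mirroring the stream shape shown above.
    msgs = [
        TextMessage("user", "What is bob's birthday?"),
        TextMessage("Bob", "My birthday is on 1st January."),
    ]
    for m in msgs:
        yield m
    yield TaskResult(messages=msgs)

async def main():
    final = None
    async for item in fake_run_stream():
        if isinstance(item, TaskResult):
            final = item  # last item carries the full run
        else:
            print(f"{item.source}: {item.content}")  # progress update for the UI
    return final

result = asyncio.run(main())
```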

@ekzhu ekzhu marked this pull request as ready for review October 31, 2024 20:35
gagb commented Oct 31, 2024

Why is there an error in the execution result of Alice?

ekzhu commented Oct 31, 2024

> Why is there an error in the execution result of Alice?

This is a separate bug with the auto-generated handoff functions. The result is not actually used, so it does not affect the task.

Fixed.

@husseinmozannar (Contributor) left a comment


Looks great! Tested a few times. I'm not sure how I feel about the last message being a TaskResult while everything else is consistently a message with source and content fields. Maybe there is a way to have stream.messages and stream.task_result? This is not necessary, just an optional suggestion.
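The suggested stream.messages / stream.task_result shape could be approximated with a small wrapper. A sketch with hypothetical names (not part of this PR), assuming the stream ends with a final result object of a known type:

```python
import asyncio

class StreamCollector:
    """Wraps an async stream that yields messages and ends with a final
    result object, exposing them separately after consumption."""

    def __init__(self, final_type):
        self.final_type = final_type
        self.messages = []
        self.task_result = None

    async def collect(self, stream):
        async for item in stream:
            if isinstance(item, self.final_type):
                self.task_result = item  # the final aggregate result
            else:
                self.messages.append(item)  # an intermediate message
        return self.task_result

async def demo_stream():
    # Stand-in stream: two messages, then an int standing in for TaskResult.
    yield "hello"
    yield "world"
    yield 42

async def main():
    collector = StreamCollector(final_type=int)
    await collector.collect(demo_stream())
    return collector

c = asyncio.run(main())
```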

@ekzhu
Copy link
Collaborator Author

ekzhu commented Nov 1, 2024

> Looks great! Tested a few times. I'm not sure how I feel about the last message being a TaskResult while everything else is consistently a message with source and content fields. Maybe there is a way to have stream.messages and stream.task_result? This is not necessary, just an optional suggestion.

Python currently doesn't support returning a value from an async generator. A hack would be to return the value through an exception. https://discuss.python.org/t/allow-return-statements-with-values-in-asynchronous-generators/66886/2
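To illustrate the limitation: `return <value>` inside an async generator is a compile-time SyntaxError, so the final result has to travel some other way, e.g. yielded as the last item of the stream (the pattern used here). A minimal sketch:

```python
import asyncio

# `return` with a value inside an async generator is rejected by the
# compiler, which we can demonstrate without executing anything:
src = "async def gen():\n    yield 1\n    return 42\n"
try:
    compile(src, "<demo>", "exec")
    compiled = True
except SyntaxError:
    compiled = False  # 'return' with value in async generator

# Workaround: yield the final aggregate as the last item of the stream.
async def run_stream():
    yield "progress 1"
    yield "progress 2"
    yield {"result": "done"}  # final aggregate, yielded last

async def main():
    last = None
    async for item in run_stream():
        last = item  # the last yielded item is the final result
    return last

final = asyncio.run(main())
```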

Our client also uses this design to stream completion parts followed by the complete message. https://microsoft.github.io/autogen/dev/user-guide/core-user-guide/framework/model-clients.html#streaming-response

@ekzhu ekzhu merged commit cff7d84 into main Nov 1, 2024
37 checks passed
@ekzhu ekzhu deleted the agentchat-stream branch November 1, 2024 11:12
frances720 added a commit to Promptless/autogen-test that referenced this pull request Nov 4, 2024

Successfully merging this pull request may close these issues.

Streaming messages from a run
