
Streaming partial messages for UI integration. #3862

Open
jackgerrits opened this issue Oct 21, 2024 · 3 comments
Labels
needs-design (A design needs to be created and agreed upon), proj-agentchat, size-medium (takes up to a week)
Milestone

Comments

@jackgerrits
Member

jackgerrits commented Oct 21, 2024

We need to ensure the streaming use case is adequately supported in the API. While agent-to-agent communication likely does not need streaming, agent-to-UI communication does.

User can:

  • Create a UI which can show partial model output
  • Interrupt an in-progress streaming response (see the sketch below)
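
A minimal sketch of both requirements, assuming a token-level async stream; `fake_token_stream` and the rendering loop are illustrative stand-ins, not an existing autogen API:

```python
import asyncio


async def fake_token_stream():
    # Stand-in for a streaming LLM response.
    for token in ["Hello", ",", " world", "!"]:
        await asyncio.sleep(0.1)
        yield token


async def render_to_ui():
    async for token in fake_token_stream():
        # A real UI would append each token to the message bubble.
        print(token, end="", flush=True)


async def main():
    task = asyncio.create_task(render_to_ui())
    await asyncio.sleep(0.25)
    task.cancel()  # user clicked "stop": interrupt the in-progress response
    try:
        await task
    except asyncio.CancelledError:
        print("\n[interrupted]")


asyncio.run(main())
```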
@jackgerrits jackgerrits added needs-design, enhancement, proj-agentchat, proj-core labels Oct 21, 2024
@jackgerrits jackgerrits added this to the 0.4 milestone Oct 21, 2024
@fniedtner fniedtner changed the title Streaming support in 0.4 test & create sample for streaming Oct 22, 2024
@ekzhu
Collaborator

ekzhu commented Oct 22, 2024

We should use a closure agent to collect the partial messages and pump them through to the front end via a websocket.

The partial message type is application defined.

We can showcase this in a cookbook or a sample directory.
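
A rough sketch of that idea, assuming an asyncio queue as the collection point and the `websockets` package for the pump; the closure-agent wiring itself is elided, and `PartialMessage` is the application-defined type mentioned above:

```python
import asyncio
import json
from dataclasses import dataclass

import websockets


@dataclass
class PartialMessage:
    # Application-defined partial message type.
    source: str
    content: str


queue: asyncio.Queue = asyncio.Queue()


async def collect_partial(message: PartialMessage) -> None:
    # The closure agent would call this for each partial message it receives.
    await queue.put(message)


async def ws_handler(websocket) -> None:
    # Drain the queue and pump partial messages through to the connected UI.
    # (Older websockets versions pass a second `path` argument here.)
    while True:
        msg = await queue.get()
        await websocket.send(json.dumps({"source": msg.source, "content": msg.content}))


async def main() -> None:
    async with websockets.serve(ws_handler, "localhost", 8765):
        await asyncio.Future()  # run until cancelled


asyncio.run(main())
```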

@fniedtner fniedtner removed the feature label Oct 24, 2024
@jackgerrits jackgerrits added the size-medium takes up to a week label Oct 24, 2024
@ekzhu
Collaborator

ekzhu commented Oct 26, 2024

Related to #3970 and #3969, a simple solution is to just have the agent and team return a streaming iterator.
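
A sketch of what such an interface could look like; the `run_stream` name, signature, and string message type are illustrative assumptions, not the settled API:

```python
import asyncio
from typing import AsyncIterator


class Team:
    """Stand-in for an agentchat team with a streaming run method."""

    async def run_stream(self, task: str) -> AsyncIterator[str]:
        # Yield intermediate messages as they are produced,
        # instead of returning only the final result.
        for event in (f"received task: {task}", "partial model output...", "final answer"):
            await asyncio.sleep(0.1)
            yield event


async def consume() -> None:
    team = Team()
    async for event in team.run_stream(task="summarize the report"):
        print(event)  # a UI would render each event incrementally


asyncio.run(consume())
```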

@ekzhu ekzhu changed the title test & create sample for streaming Streaming partial messages for UI integration. Oct 30, 2024
@ekzhu ekzhu removed the proj-core label Oct 30, 2024
@fniedtner fniedtner modified the milestones: 0.4.0, 0.4.1 Nov 25, 2024
@SuMiaoALi

Related to #3970 and #3969, a simple solution is to just have the agent and team return a streaming iterator.

I totally agree!
Agent token streaming output is very important.
In some cases streaming output may not be needed, such as the interaction between tool_caller and tool_executor; but when interacting with user agents it is essential. LLM output is streamed and takes time, so we need to let users see the first token as soon as possible.
I think the fastest way to achieve this is for any agent that interacts with UserProxyAgent to return an iterator or generator instead of full chat_messages, so that we can implement streaming output to the UI ourselves.
Looking forward to it. Thanks!
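
To illustrate the time-to-first-token point, a small self-contained example; the per-token delay is a stand-in for model latency:

```python
import asyncio
import time


async def llm_tokens():
    # Stand-in for a token stream from the model.
    for token in ["The", " answer", " is", " 42", "."]:
        await asyncio.sleep(0.3)
        yield token


async def main():
    start = time.monotonic()
    first = True
    async for token in llm_tokens():
        if first:
            # The user starts reading here, long before the response is done.
            print(f"first token after {time.monotonic() - start:.1f}s")
            first = False
        print(token, end="", flush=True)
    print(f"\nfull response after {time.monotonic() - start:.1f}s")


asyncio.run(main())
```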
