
Support for Retrieving Streamed Output from Any Model Node in Dify's Workflow API #12009

Closed
5 tasks done
EmiyaredA opened this issue Dec 23, 2024 · 3 comments
Labels
💪 enhancement New feature or request stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed

Comments

@EmiyaredA

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English; otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing? Tell me about your story.

I would like to suggest that Dify's workflow API support retrieving the streamed output of any model node in the workflow during execution in streaming mode.
Currently, the only way to get the streamed output of a specific model is to add the model node's output variable to the final end node of the workflow.
The reason I’m raising this issue is that, in one of my use cases, I used conditional nodes to split the workflow into branches. Each branch called a model, and all outputs were eventually merged at an aggregation node. In this setup, it was very difficult to access the streamed output of each model node in real time.

2. Additional context or comments

Having the ability to directly retrieve the streamed output from any model node would make the workflow API more flexible and easier to use in cases involving conditional branches or aggregation of multiple streams. This would save developers from needing to add extra steps or workarounds just to collect streamed outputs.
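To illustrate the request, here is a minimal sketch of a client-side consumer for Dify's streaming workflow API (`POST /v1/workflows/run` with `response_mode: "streaming"`). It groups streamed text by originating node, which is what this issue asks to be possible for any model node. The exact event name (`text_chunk`) and field shapes (`data.text`, `data.from_variable_selector`) are taken from the public API docs as I understand them; treat them as assumptions, and note the sample events below are synthetic.

```python
import json

def collect_node_chunks(sse_lines):
    """Group streamed text chunks by the node they came from.

    Assumes Dify-style SSE payloads where each `data:` line carries a JSON
    event and `text_chunk` events (field names are an assumption) hold the
    partial text. Today only output wired into the End node streams this
    way, which is the limitation described above.
    """
    chunks = {}
    for line in sse_lines:
        if not line.startswith("data:"):
            continue  # skip SSE comments, blank keep-alives, etc.
        event = json.loads(line[len("data:"):].strip())
        if event.get("event") == "text_chunk":
            # First element of from_variable_selector identifies the node.
            node = event.get("data", {}).get("from_variable_selector", ["?"])[0]
            chunks.setdefault(node, []).append(event["data"]["text"])
    return {node: "".join(parts) for node, parts in chunks.items()}

# Synthetic events standing in for a real SSE response body:
sample = [
    'data: {"event": "workflow_started", "data": {}}',
    'data: {"event": "text_chunk", "data": {"text": "Hel", "from_variable_selector": ["llm_a", "text"]}}',
    'data: {"event": "text_chunk", "data": {"text": "lo", "from_variable_selector": ["llm_a", "text"]}}',
    'data: {"event": "workflow_finished", "data": {}}',
]
print(collect_node_chunks(sample))  # {'llm_a': 'Hello'}
```

With the requested feature, chunks from every model node (including those inside conditional branches) would appear in this stream, keyed by node, instead of only the text routed to the End node.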

3. Can you help us with this feature?

  • I am interested in contributing to this feature.
@dosubot dosubot bot added the 💪 enhancement New feature or request label Dec 23, 2024
@crazywoola crazywoola added this to the 1.0 milestone Dec 23, 2024

dosubot bot commented Jan 23, 2025

Hi, @EmiyaredA. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.

Issue Summary:

  • Enhancement proposal for Dify's workflow API.
  • Aim: Enable retrieval of streamed output from any model node during execution in streaming mode.
  • Current workaround involves adding the model node's output to the final end node.
  • No further comments or activity since the issue was opened.

Next Steps:

  • Is this issue still relevant to the latest version of the Dify repository? If so, please comment to keep the discussion open.
  • If no further activity occurs, this issue will be automatically closed in 15 days.

Thank you for your understanding and contribution!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jan 23, 2025
@toddheslin

I can confirm this is still a limitation of the workflow API. A workflow that looks like this is not streamed as I expect:
[Image: screenshot of the branching workflow]

I do receive a response with text/event-stream, and the various workflow events stream out, but most of the time is spent waiting for the LLM to generate its output. The whole workflow step is sent to the event consumer only after it has completed, not as it is being generated.

This differs from running the workflow inside Dify where the LLM stream shows on the screen word by word.
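The behavior described in this comment can be checked mechanically: if a node's text only ever appears inside its `node_finished` event, the API delivered it as one lump rather than streaming it. A small classifier, sketched below with synthetic events (event and field names mirror Dify's documented workflow events but should be treated as assumptions):

```python
import json

def arrival_pattern(sse_lines, node_id):
    """Classify how a node's text arrived over SSE.

    Returns "incremental" if any text_chunk events streamed it,
    "lump" if the text only appeared in node_finished, and "none"
    if the node never produced text. Field shapes are assumptions
    based on Dify's documented workflow events.
    """
    chunk_count, finished_text = 0, None
    for line in sse_lines:
        if not line.startswith("data:"):
            continue
        event = json.loads(line[len("data:"):].strip())
        data = event.get("data", {})
        if (event.get("event") == "text_chunk"
                and data.get("from_variable_selector", [None])[0] == node_id):
            chunk_count += 1
        elif event.get("event") == "node_finished" and data.get("node_id") == node_id:
            finished_text = data.get("outputs", {}).get("text")
    if chunk_count > 0:
        return "incremental"
    return "lump" if finished_text is not None else "none"

# The reported behavior: the LLM node's text shows up only once, in
# node_finished, so the classifier returns "lump".
events = [
    'data: {"event": "node_started", "data": {"node_id": "llm_1"}}',
    'data: {"event": "node_finished", "data": {"node_id": "llm_1", "outputs": {"text": "full answer"}}}',
]
print(arrival_pattern(events, "llm_1"))  # lump
```

Running a workflow inside the Dify UI corresponds to the "incremental" case; the workflow API, as reported here, produces "lump" for model nodes not wired to the End node.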

@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jan 23, 2025

dosubot bot commented Jan 23, 2025

@takatost, the user has confirmed that the limitation of the workflow API is still relevant, as they are experiencing issues with streaming output during execution. Could you please assist them with this matter?

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Feb 23, 2025