Add support for RunnableGenerator
#4936
Hey @juanbercoff, good to see you back around here! Hope you're doing well. Thank you for providing a detailed description of the issue you're encountering. Given the context and the code snippet you provided, there are a few potential reasons for this behavior.
To address this issue, you might need to:
I hope this information helps you understand the potential causes of the issue and possible steps to resolve it. If you have any further questions or need more clarification, please don't hesitate to ask.
My guess is that the Anthropic model isn't returning JSON, or the parser is otherwise failing to parse the output. Can you share what output you're seeing from the model?
Hi Jacob, when running the Anthropic model with the JSON parser and no nesting, it works as expected and outputs JSON. The issue occurs when nesting, as shown in the example.
This is still happening, and now it's also happening with the OpenAI model.
Hey, sorry - I have just reproduced the issue and will look into it.
Ah, no, this is intended. The issue is the final step in your chain:

```js
{
  output: (i) => {
    console.log('i', i);
    return 'output';
  },
}
```

Because this is a plain function rather than something that returns an iterator, it isn't handed the aggregated output when streaming. There's another construct that isn't in LangChain.js yet that would support what you want to do - will keep this issue open and look into adding it.
RunnableGenerator
Hi Jacob,
Really? Are you sure you have the final step as that custom function in both cases? It doesn't make sense to repeatedly call a function that doesn't return an iterator with the chunks, so that might be a bug on our end if that's the case.
I'll try to provide another reproduction example as soon as I can.
Hello 👋, I am part of a group of University of Toronto students looking to investigate this issue. Have you had this issue with any output parsers other than JsonOutputParser? Were there any specific changes to your chain or the nesting that caused the OpenAI chat model to produce this issue, despite it originally working as you intended?
We changed most of our architecture recently and ended up writing our own output parser. So tbh I don't know if this issue is still reproducible with the latest updates. |
Hi, @juanbercoff. I'm Dosu, and I'm helping the LangChain JS team manage their backlog. I'm marking this issue as stale. Issue Summary:
Next Steps:
Thank you for your understanding and contribution!
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
No response
Description
When using the ChatAnthropic model and nesting a RunnableSequence with a JsonOutputParser, the expected input of the output chain should be
{ else: 'else', runnableSequence: /* some output */ }
but instead only one property appears in the log. This does not happen when using ChatOpenAI, for example. It only happens when streaming the final chain; if I invoke the chain, both of the outputs appear.
System Info
[email protected]
windows
node v20.11.0
pnpm 8.3.1