
Add support for RunnableGenerator #4936

Closed
juanbercoff opened this issue Mar 30, 2024 · 12 comments · May be fixed by #7691
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature help wanted This would make a good PR

Comments

@juanbercoff

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

  import { ChatAnthropic } from '@langchain/anthropic'
  import { PromptTemplate } from '@langchain/core/prompts'
  import { JsonOutputParser } from '@langchain/core/output_parsers'
  import { RunnableSequence } from '@langchain/core/runnables'

  const model = new ChatAnthropic()
  const template = PromptTemplate.fromTemplate(
    'Tell me a joke about {topic}. Respond with a JSON'
  )
  const runnableSequence = RunnableSequence.from([
    template,
    model,
    new JsonOutputParser(),
  ])

  const chain = RunnableSequence.from([
    {
      topic: (i) => i,
    },
    {
      runnableSequence,
      else: () => 'else',
    },
    {
      output: (i) => {
        console.log('i', i)
        return 'output'
      },
    },
  ])

Error Message and Stack Trace (if applicable)

No response

Description

When using the ChatAnthropic model and nesting a RunnableSequence with a JsonOutputParser, the expected input of the output chain should be { else: 'else', runnableSequence: // some output }, but only one property appears in the log. This does not happen when using ChatOpenAI, for example.

This only happens when streaming the final chain; if I invoke the chain, both of the outputs appear.

System Info

[email protected]
windows
node v20.11.0
pnpm 8.3.1


dosubot bot commented Mar 30, 2024

Hey @juanbercoff, good to see you back around here! Hope you're doing well.

Thank you for providing a detailed description of the issue you're encountering with RunnableMap when using ChatAnthropic and JsonOutputParser. This seems to be a complex issue that might be related to how these services handle or format their outputs, especially when tools or functions are involved in the responses.

Given the context and the code snippet you provided, there are a few potential reasons for this behavior:

  1. Output Formatting Differences: ChatAnthropic and ChatOpenAI might format their outputs differently, especially when tools or functions are involved. This difference in formatting could affect how JsonOutputToolsParser parses the output. If ChatAnthropic's output doesn't match the expected format or includes additional metadata that JsonOutputToolsParser doesn't account for, it might result in missing or incorrectly parsed outputs.

  2. Tool Invocation Handling: The way ChatAnthropic handles the invocation of tools within the text might differ from ChatOpenAI. This difference could affect the output in a way that JsonOutputToolsParser can't correctly parse it. For example, if ChatAnthropic includes additional information or uses a different structure to indicate the tool's output, the parser might not recognize or correctly extract the relevant data.

  3. Model Differences: The modelName specified (claude-3-sonnet-20240229) might handle the structured output differently than models used with ChatOpenAI. This could inherently affect how data is structured in the output, impacting the parser's ability to extract the expected information.

  4. Parser Expectations: JsonOutputToolsParser might have specific expectations about the output's structure that are met by ChatOpenAI but not by ChatAnthropic. This could be due to assumptions made during the parser's development about how data is presented in the output.

To address this issue, you might need to:

  • Inspect the Raw Output: Directly inspect the raw output from ChatAnthropic to identify any differences in structure or formatting compared to ChatOpenAI.
  • Adjust the Parser: Modify JsonOutputToolsParser to handle the specific output format or structure provided by ChatAnthropic, ensuring it can correctly parse and extract the necessary information.

You can find more information about how RunnableMap handles the outputs from different services in the LangChain JS codebase in the RunnableMap test file. This file contains tests that demonstrate how RunnableMap can handle parallel processing of different services and then aggregate their outputs.

I hope this information helps you understand the potential causes of the issue and possible steps to resolve it. If you have any further questions or need more clarification, please don't hesitate to ask.


@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Mar 30, 2024
@jacoblee93
Collaborator

My guess is that the Anthropic model isn't returning JSON or the parser is otherwise failing to parse the output. Can you share what output you're seeing from the model?

@juanbercoff
Author

Hi Jacob,

When running the Anthropic model with the JSON parser un-nested, it works as expected and outputs JSON. The issue occurs only when nesting, as shown in the example.

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jul 8, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Jul 15, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jul 15, 2024
@juanbercoff
Author

This is still happening, and it is now also happening with the OpenAI model.

@dosubot dosubot bot reopened this Jul 21, 2024
@jacoblee93
Collaborator

jacoblee93 commented Jul 23, 2024

Hey, sorry. I have just reproduced the issue and will look into it.

@jacoblee93
Collaborator

jacoblee93 commented Jul 23, 2024

Ah, no, this is intended behavior. The issue is the final step in your RunnableMap:

    {
      output: (i) => {
        console.log('i', i)
        return 'output'
      },
    },

Because output is a function there, it becomes a RunnableLambda, which gathers the entire output of the previous step before invoking the function.

There's another construct that isn't in LangChain.js yet that would support what you want to do; I'll keep this issue open and look into adding it.

https://github.com/langchain-ai/langchain/blob/master/libs/core/langchain_core/runnables/base.py#L3757

@jacoblee93 jacoblee93 changed the title RunnableMap does not return one of the outputs when streamed with ChatAnthropic and inner chain uses JsonOutputParser Add support for RunnableGenerator Jul 23, 2024
@jacoblee93 jacoblee93 added the help wanted This would make a good PR label Jul 23, 2024
@juanbercoff
Author

Hi Jacob,
But why does it work with the StringOutputParser and not with the JSONOutputParser? Using our own custom parser also works.

@jacoblee93
Collaborator

Really? Are you sure you have the final step as that custom function in both cases?

It doesn't make sense to repeatedly call a function that doesn't return an iterator with chunks, so that might be a bug on our end if that's the case.

@juanbercoff
Author

I'll try to provide another reproduction example as soon as I can.

@CarterMorris
Contributor

Hello 👋, I am part of a group of University of Toronto students looking to investigate this issue. Have you had this issue with any output parsers other than JSONOutputParser? Were there any specific changes to your chain or the nesting that caused the OpenAI chat model to produce this issue despite it originally working as you intended?

@juanbercoff
Author

We changed most of our architecture recently and ended up writing our own output parser, so tbh I don't know whether this issue is still reproducible with the latest updates.


dosubot bot commented Jan 20, 2025

Hi, @juanbercoff. I'm Dosu, and I'm helping the LangChain JS team manage their backlog. I'm marking this issue as stale.

Issue Summary:

  • The issue involves a bug with the ChatAnthropic model in LangChain.js when used with RunnableSequence and JsonOutputParser.
  • The expected output structure is not logged correctly, potentially due to output formatting differences and parser expectations.
  • You noted that the issue also occurs with the OpenAI model and mentioned changes in your architecture, making it unclear if the issue persists.
  • @CarterMorris expressed interest in investigating the issue further.

Next Steps:

  • Please let us know if this issue is still relevant to the latest version of the LangChain JS repository by commenting on this issue.
  • If there is no further activity, the issue will be automatically closed in 7 days.

Thank you for your understanding and contribution!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jan 20, 2025
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Jan 27, 2025
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jan 27, 2025