
StreamEvents from langchain js core 0.3.40 not working with python back-end langgraph 0.2.74 #7749

Open
mfchanou opened this issue Feb 24, 2025 · 4 comments
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


mfchanou commented Feb 24, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { RemoteRunnable } from "@langchain/core/runnables/remote";


const openai = new RemoteRunnable({
  url: "http://localhost:8001/chat/"
});

const config = {
  configurable: {
    thread_id: "1"
  },
  version: "v2"
};

async function streamMessages() {
  try {
    const eventStream = await openai.streamEvents(
      { messages: [{ role: "user", content: "hello" }] },
      config
    );

    for await (const event of eventStream) {
      const eventType = event.event;
      console.log("event type:", eventType);

      if (eventType === "on_chat_model_stream") {
        const content = event.data?.chunk?.content;
        if (content !== undefined && content !== "") {
          console.log(`| ${content}`);
        }
      }
    }
  } catch (error) {
    console.error("Error:", error);
    if (error.cause) console.error("Cause:", error.cause);
    if (error.stack) console.error("Stack:", error.stack);
  }
}

streamMessages(); 

Error Message and Stack Trace (if applicable)

Error: TypeError: obj.messages.map is not a function
at revive (.../node_modules/@langchain/core/dist/runnables/remote.js:164:40)
at innerRevive (.../node_modules/@langchain/core/dist/runnables/remote.js:175:13)

Description

I exposed a Python LangGraph agent through LangServe and call that LangServe API from the JS script above. When calling the streamEvents method, processing of the LLM tokens fails because the last event, of type on_chain_end, returns a dict whose messages key holds a single message object rather than an array. The LangChain core JS revive method expects messages to be an array.
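To make the mismatch concrete, here is a minimal sketch of the two payload shapes involved (the message contents below are hypothetical placeholders, not the actual payload):

```python
# Illustrative only: the two shapes of the "messages" key.

# Shape that @langchain/core's revive() helper can handle:
# "messages" is an array of message objects.
expected = {"messages": [{"type": "ai", "content": "hello"}]}

# Shape arriving in the final on_chain_end event: "messages" is a single
# message object, so obj.messages.map(...) throws a TypeError on the JS side.
received = {"messages": {"type": "ai", "content": "hello"}}
```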

System Info

JavaScript
@langchain/core 0.3.40

Python
LangGraph 0.2.74
LangServe 0.3.1

@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Feb 24, 2025
@mfchanou mfchanou changed the title StreamEvents from langchain js core 0.3.40 not working with python back-end langraph 0.2.74 StreamEvents from langchain js core 0.3.40 not working with python back-end langgraph 0.2.74 Feb 24, 2025

dosubot bot commented Feb 24, 2025

Hello @mfchanou 👋, I'm here to help you with your software development needs. If you have any questions or need assistance with bug fixes, feel free to ask me.

I found a similar closed issue titled "streamEvents from langchain js not working with python back-end" which was resolved by a pull request that added StreamEvents to Remote Runnable [1]. However, the specific changes made in the pull request are not detailed in the available information.

The error you're encountering, "TypeError: obj.messages.map is not a function," suggests that the messages object is not in the expected array format. The revive function in LangChain.js expects the messages object to be an array, as it maps over each message to create a new ChatPromptValue object [2].

To resolve this issue, ensure that the messages object in your input to streamEvents is structured correctly as an array. The expected structure is:

{
  "input": {
    "messages": [
      [
        new SystemMessage("You are Godzilla"),
        new HumanMessage("hello")
      ]
    ]
  }
}

This structure is part of the data object in the event with the event type "on_llm_start" [3]. Make sure that the Python back-end is returning the messages in this expected format.

To continue talking to Dosu, mention @dosu.




mfchanou commented Feb 24, 2025

@dosu the output is coming directly from LangGraph itself; I didn't touch it, which implies an incompatibility between the Python LangGraph output and the input expected by LangChain JS.

Here is the langgraph agent python code

from dataclasses import dataclass
from typing import Annotated

from dotenv import load_dotenv
from fastapi import FastAPI

from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from langserve import add_routes

load_dotenv()




@dataclass
class State:
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model_name="gpt-4o", temperature=0, streaming=True)


prompt = ChatPromptTemplate.from_messages([
    ("system", """
      You're a kubernetes expert. Be clear and concise
    """),
    ("placeholder", "{messages}"),

])


tool = TavilySearchResults(max_results=2)
tools = [tool]

llm_with_tools = llm.bind_tools(tools)
chain = prompt | llm_with_tools
graph_builder = StateGraph(State)

def chatbot_node(state: State):
    return {"messages": chain.invoke({"messages": state.messages})}


graph_builder.add_node("chatbot", chatbot_node)
graph_builder.add_node("tools", ToolNode(tools))

graph_builder.add_edge(START, "chatbot")
graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

graph_builder.add_edge("tools", "chatbot")


memory = MemorySaver()
graph = graph_builder.compile(checkpointer=memory)



app = FastAPI(
    title="My API",
    version="1.0",
    description="my API"
)

add_routes(
    app,
    graph,
    path="/chat",
    config_keys=["configurable"],
    playground_type="chat"
)


dosubot bot commented Feb 24, 2025

The issue you're encountering seems to stem from an incompatibility between the output format of the Langgraph Python code and the expected input format for the LangChain JS streamEvents method. The error message indicates that the messages object is not in the expected array format.

In your Python code, the chatbot_node function returns a dictionary with a key messages, which is the result of chain.invoke({"messages": state.messages}). To ensure compatibility, you need to verify that the output of chain.invoke is structured as an array of messages, as expected by LangChain JS.

Here's a potential solution:

  1. Check the Output Format: Ensure that the chain.invoke method in your Python code returns the messages in the correct format. The expected format for LangChain JS is an array of message objects, each with a role and content.

  2. Modify the Output if Necessary: If the output is not in the expected format, you may need to adjust the chatbot_node function to transform the output into an array of message objects. For example:

    def chatbot_node(state: State):
        result = chain.invoke({"messages": state.messages})
        # Ensure the result is a list of message objects
        if isinstance(result, dict) and 'messages' in result:
            messages = result['messages']
            if not isinstance(messages, list):
                messages = [messages]  # Wrap in a list if not already
            return {"messages": messages}
        return {"messages": []}  # Return an empty list if the format is incorrect
  3. Test the Integration: After making these changes, test the integration again to see if the error is resolved.

By ensuring that the output from the Python backend matches the expected input format for the JavaScript frontend, you should be able to resolve the incompatibility issue.





mfchanou commented Feb 24, 2025

@dosu still the same issue. After checking, it seems the event causing trouble is an on_chain_start event that occurs after an on_chain_end. As you can see in the payload, the messages key is not an array. Another question: why is there an on_chain_start event after an on_chain_end?

{
  "event": "on_chain_start",
  "data": {
    "input": {
      "messages": {
        "content": "Bonjour ! Comment puis-je vous aider aujourd'hui avec vos questions ou besoins en crypto-monnaies ?",
        "additional_kwargs": {},
        "response_metadata": {
          "finish_reason": "stop",
          "model_name": "gpt-4o-2024-08-06",
          "system_fingerprint": "fp_eb9dce56a8"
        },
        "type": "ai",
        "name": null,
        "id": "run-53b93eb2-a2cb-4b76-a106-89845d71328e",
        "example": false,
        "tool_calls": [],
        "invalid_tool_calls": [],
        "usage_metadata": null
      }
    }
  },
  "name": "_write",
  "tags": ["seq:step:2", "langsmith:hidden"],
  "run_id": "289df01a-d374-4523-ba19-8ce30e1a1684",
  "metadata": {
    "checkpoint_ns": "",
    "thread_id": "1",
    "langgraph_step": 7,
    "langgraph_node": "chatbot",
    "langgraph_triggers": ["start:chatbot"],
    "langgraph_path": ["__pregel_pull", "chatbot"],
    "langgraph_checkpoint_ns": "chatbot:3eb7a61a-c97e-28dc-f86d-f08467df742c"
  },
  "parent_ids": ["3425d66e-6351-48bd-aeeb-80dc9ba77a29", "7c5921dd-e61c-472e-93d7-ff225deff137"]
}
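Note that the failing payload is named `_write` and tagged `langsmith:hidden`, i.e. it comes from one of LangGraph's internal channel-write steps rather than from a user-defined node. A small Python sketch for recognizing such internal events when iterating the stream server-side, e.g. via `graph.astream_events(..., version="v2")` (the helper name and filtering approach are mine, not part of either library, and this cannot fix the JS error itself, since revive() fails before the event reaches user code, but it can help confirm which events are internal):

```python
def is_internal_event(event: dict) -> bool:
    """True for events emitted by LangGraph's hidden internal steps,
    such as the `_write` channel writes tagged `langsmith:hidden`."""
    return "langsmith:hidden" in (event.get("tags") or [])


# Diagnostic sketch, assuming `graph` is the compiled graph above:
#
# async for event in graph.astream_events(inputs, config, version="v2"):
#     if is_internal_event(event):
#         print("internal:", event["event"], event.get("name"))
```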
