LangGraph Studio without Human Message ID #1805
Hi there! What does your State schema look like? If you're using add_messages, that reducer will typically ensure your messages have an ID. I agree it makes sense to generate an ID for messages originating from the Studio. cc @dqbd
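For illustration, here is a minimal stdlib-only sketch of what an ID-assigning reducer in the spirit of `add_messages` does (this is not LangGraph's actual implementation; the `Message` class and function name are made up for the example):

```python
import uuid
from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    # Toy stand-in for a chat message; real LangChain messages carry an
    # optional `id` attribute in the same way.
    content: str
    id: Optional[str] = None

def add_messages_sketch(left: list, right: list) -> list:
    """Concatenate message lists, back-filling a random ID on any incoming
    message that lacks one, so later removal-by-ID can work."""
    merged = list(left)
    for msg in right:
        if msg.id is None:
            msg.id = str(uuid.uuid4())  # assign an ID the UI didn't supply
        merged.append(msg)
    return merged
```

With a reducer like this, a message created without an ID (as the Studio UI does today) still ends up addressable by ID in the state.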
Yes, here is more context (imports added for completeness):

```python
import operator
from typing import Annotated, List, Literal

from pydantic import BaseModel, Field, validator
from typing_extensions import TypedDict
from langchain_core.messages import AnyMessage
from langgraph.graph import StateGraph

class GraphConfig(BaseModel):
    """
    Initial configuration to trigger the AI system.

    Attributes:
    - qa_model: Select the model for the LLM. Options include 'openai', 'google', 'meta', or 'amazon'.
    - system_prompt: Select the prompt of your conversation.
    - temperature: Select the temperature for the model. Options range from 0 to 1.
    - using_summary_in_memory: If you want to summarize previous messages, set True. Otherwise, False.
    """
    qa_model: Literal[*AVAILABLE_MODELS]
    system_prompt: Literal[*CUSTOM_PROMPTS.keys()]
    temperature: float = Field(ge=0, le=1)
    using_summary_in_memory: bool = False

    @validator("temperature")
    def check_temperature(cls, temperature: float):
        if temperature < 0.0 or temperature > 1.0:
            raise ValueError("Temperature should be between 0 and 1")
        return temperature

class State(TypedDict):
    messages: Annotated[List[AnyMessage], operator.add]
    summary: str

class GraphInput(TypedDict):
    """The initial message that starts the AI system."""
    messages: Annotated[List[AnyMessage], operator.add]

class GraphOutput(TypedDict):
    """The output of the AI system."""
    messages: List[AnyMessage]

workflow = StateGraph(
    State,
    input=GraphInput,
    output=GraphOutput,
    config_schema=GraphConfig,
)
```
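One thing worth noting about the schema above: with `operator.add` as the reducer, a removal instruction is simply appended to the list rather than interpreted, so deletion by ID can never happen. A removal-aware reducer has to match on message IDs, which is why a missing ID breaks it. A toy stdlib model of that behaviour (the class and function names here are illustrative, not LangGraph's API):

```python
from dataclasses import dataclass

@dataclass
class Msg:
    # Toy message with a required ID.
    content: str
    id: str

@dataclass
class RemoveMsg:
    # Toy removal instruction, analogous in spirit to RemoveMessage:
    # it refers to an existing message purely by its ID.
    id: str

def removal_aware_reducer(left: list, right: list) -> list:
    """Append normal messages; interpret removal instructions by dropping
    the message whose ID matches. If a stored message has no ID, there is
    nothing for the removal to match against."""
    out = list(left)
    for item in right:
        if isinstance(item, RemoveMsg):
            out = [m for m in out if m.id != item.id]
        else:
            out.append(item)
    return out
```

Plain list concatenation (`operator.add`) would instead leave the `RemoveMsg` sitting in the list, which mirrors why the summarisation step needs both an ID-aware reducer and IDs on every message.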
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
I developed a simple chatbot system with a memory summarisation feature, but when I test it in LangGraph Studio, I see that the Human Message created in the LangGraph Studio UI doesn't get an ID. I think one should be generated randomly and automatically.
Then, when the memory summarisation runs, the RemoveMessage fails because the Human Message doesn't have an associated ID.
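As a possible workaround on the graph side (assuming a recent langgraph version, where `add_messages` is re-exported from `langgraph.graph`), the state schema could swap `operator.add` for the `add_messages` reducer, which assigns IDs to incoming messages and interprets `RemoveMessage`. Sketch only; whether this fully sidesteps the Studio behaviour is an assumption:

```python
from typing import Annotated, List
from typing_extensions import TypedDict
from langchain_core.messages import AnyMessage
from langgraph.graph import add_messages

class State(TypedDict):
    # add_messages back-fills missing message IDs and honours RemoveMessage,
    # unlike plain list concatenation via operator.add.
    messages: Annotated[List[AnyMessage], add_messages]
    summary: str
```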
Video sample:
My.Movie.mp4
Here is the state info:
System Info
langchain_community
python-dotenv
langchain_openai
langchain_core
langchain_google_vertexai
langchain
streamlit
langchain-google-genai
langchain-anthropic
langchain-groq
transformers
langgraph
langchain-aws