feature: get messages from messages table for the playground (langflow-ai#3874)

* refactor: Update MessageBase text attribute based on isinstance check.

* feat: Add update_message function to update a message in the database.

* refactor(chat): Update imports and remove unnecessary config method in ChatComponent.

* refactor: Add stream_message method to ChatComponent.

* refactor: Update method call in ChatOutput component.

* feat: Add callback function to custom component and update build_results signature.

* feat: Add callback parameter to instantiate_class function.

* feat(graph): Add callback functions for sync and async operations.

* feat: Add callback function support to vertex build process.

* feat: Add handling for added message in InterfaceVertex class.

* feat: Add callback support to Graph methods.

* feat(chat): Add callback function to build_vertices function.

* refactor: Simplify update_message function and use session_scope for session management.

* fix: Call set_callback method if available on custom component.

* refactor(chat): Update chat message chunk handling and ID conversion.

* feat: Add null check before setting cache in build_vertex_stream function.

* refactor: Fix send_event_wrapper function and add callback parameter to _build_vertex function.

* refactor: Simplify conditional statement and import order in ChatOutput.

* [autofix.ci] apply automated fixes

* refactor: move log method to Component class.

* refactor: Simplify CallbackFunction definition.

* feat: Initialize _current_output attribute in Component class.

* feat: store current output name in custom component during processing.

* feat: Add current output and component ID to log data.

* fix: Add condition to check current output before invoking callback.

* refactor: Update callback to log_callback in graph methods.

* feat: Add test for callback graph execution with log messages.

* update projects

* fix(chat.py): fix condition to check if message text is a string before updating message text in the database

* refactor(ChatOutput.py): update ChatOutput class to correctly store and assign the message value to ensure consistency and avoid potential bugs

* refactor(chat.py): update return type of store_message method to return a single Message object instead of a list of Messages
refactor(chat.py): update logic to correctly handle updating and returning a single stored message object instead of a list of messages

* update starter projects

* refactor(component.py): update type hint for name parameter in log method to be more explicit

* feat: Add EventManager class for managing events and event registration

* refactor: Update log_callback to event_manager in custom component classes

* refactor(component.py): rename _log_callback to _event_manager and update method call to on_log for better clarity and consistency

* refactor(chat.py): rename _log_callback method to _event_manager.on_token for clarity and consistency in method naming

* refactor: Rename log_callback to event_manager for clarity and consistency

* refactor: Update Vertex class to use EventManager instead of log_callback for better clarity and consistency

* refactor: update build_flow to use EventManager

* refactor: Update EventManager class to use Protocol for event callbacks

* if event_type is not passed, it uses the default send_event

* Add method to register event functions in EventManager

- Introduced `register_event_function` method to allow passing custom event functions.
- Updated `noop` method to accept `event_type` parameter.
- Adjusted `__getattr__` to return `EventCallback` type.

* update test_callback_graph

* Add unit tests for EventManager in test_event_manager.py

- Added tests for event registration, including default event type, empty string names, and specific event types.
- Added tests for custom event functions and unregistered event access.
- Added tests for event sending, including JSON formatting, empty data, and large payloads.
- Added tests for handling JSON serialization errors and the noop function.

* feat: Add callback function support to vertex build process.

* feat: Add callback support to Graph methods.

* feat(chat): Add callback function to build_vertices function.

* [autofix.ci] apply automated fixes

* refactor: Update callback to log_callback in graph methods.

* fetching data from messages and builds at the same time, need to remove duplicates

* refactor: Sort chat history by timestamp in ChatView component

* fix: update serialization and improve error handling (langflow-ai#3516)

* feat(utils): add support for V1BaseModel in serialize_field

Add support for V1BaseModel instances in the serialize_field function by
checking for a "to_json" method. If the method is not present, return the
attribute values as a dictionary.

* refactor: Update field serializer function and error handling in build_flow function

* remove use memo to prevent bugs

* feat: add updateMessagePartial method to MessagesStoreType

* feat: update message partially in MessagesStoreType

This commit adds the `updateMessagePartial` method to the `MessagesStoreType` in `messagesStore.ts`. This method allows updating a specific message by merging the changes with the existing message object.

* feat: add log callback for start message in ChatComponent

* feat: update log_callback name

* feat: add log_callback for messages in ChatComponent that are not streaming

* refactor: remove console.log statement in buildFlowVertices function

* refactor: store message in ChatInput after updating flow_id

This commit refactors the `ChatInput` component by moving the logic to store the message after updating the `flow_id` property. This ensures that the message is properly stored in the correct flow. The previous implementation had the logic to store the message before updating the `flow_id`, which could lead to incorrect storage of messages. This change improves the reliability and accuracy of message storage in the `ChatInput` component.

* refactor: move message storage logic in ChatInput after updating flow_id

* refactor: update ChatComponent to use stored_message.id instead of self.graph.flow_id

Update the `ChatComponent` class in `chat.py` to use the `stored_message.id` property instead of `self.graph.flow_id` when logging a message. This ensures that the correct message ID is used for logging purposes. The previous implementation used the flow ID, which could lead to incorrect logging. This change improves the accuracy of message logging in the `ChatComponent`.

* refactor: remove unused code and console.log statements

* raw: temp serializer fix

* streaming working but the message comes in one shot

* refactor: optimize message update in useMessagesStore

Improve the efficiency of updating messages in the `useMessagesStore` function of `messagesStore.ts`. Instead of iterating through the entire message list, this refactor searches for the message to update by iterating backwards from the end. This approach allows for faster message retrieval and update. The code has been modified to use a for loop and break out of the loop once the message is found. This change enhances the performance of the message update process.

* Refactor `serialize_flow_id` method to correctly handle UUID serialization in `message.py`

* Refactor `send_event` method to use `jsonable_encoder` for data serialization

* refactor: optimize message update in useMessagesStore

* streaming working with timeout

* refactor: update buildUtils.ts to use data instead of data.data in addMessage function

* version with reactState for chatHistory

* refactor: update on_message method in ChatComponent

* refactor: update on_message method in ChatComponent

* refactor: Remove unused dependency in package-lock.json

* Refactor chatView component and add hiddenSession prop

* Refactor chatView component and update hiddenSessions prop

* Refactor chatView component to use visibleSessions prop instead of hiddenSessions

* Refactor IOModal component to remove redundant code

* Refactor chatView component to include focusChat prop

* Refactor chatView component to include focusChat prop and trigger focus on chat when new session is set

* Refactor IOModal component to update visible sessions when new session is added

* feat: Add session parameter to buildFlowVertices function

* feat: Add someFlowTemplateFields function

Add the someFlowTemplateFields function to the reactflowUtils module. This function checks if any of the nodes in the provided array have template fields that pass a given validation function.

* feat: Add session parameter to buildFlowVertices function

* feat: Add session parameter to buildFlowVertices function

* update Session logic on ioModal

* Refactor ChatView component: Remove unused eraser button

The eraser button in the ChatView component was removed as it was not being used and served no purpose. This change improves code cleanliness and removes unnecessary code.

* Refactor Vertex class: Inject session_id if provided in inputs

* Refactor build_flow function: Set default session if inputs are empty

* Refactor InputValueRequest schema: Add session parameter

* Refactor IOModal component: Update session logic

* Refactor buildFlowVertices function: Update input handling

* Refactor MessagesStoreType in zustand/messages/index.ts: Remove unused columns property and setColumns method

* Refactor MessagesStoreType: Remove unused columns property and setColumns method

* Refactor SessionView component: Update columns extraction logic

* Refactor ChatView component: Remove unused variables

* Refactor useGetMessagesQuery: Remove unused setColumns method

* Refactor RenderIcons component: Set default value for filteredShortcut prop to prevent bug

* create edit message component for chat view

* Refactor useUpdateMessage: Add refetch option to trigger query refetch

* Refactor IOModal component: Remove unused variables and update useGetMessagesQuery

* Refactor ChatView component: Add session ID to message object

* update chat message to handle message edit

* update types

* fix: Update API call to send entire message object

* Refactor EditMessageField component: Add timeout to onBlur event

* Refactor EditMessageField component: Update layout of edit message field

* create migration

* add fields to data table

* feat: Add "edit" flag to message_dict in update_message API endpoint

* Refactor EditMessageField component: Improve onBlur event handling and add button click flag

* Refactor code to include "edit" flag in message types

* feat: Add EditMessageButton component for editing chat messages

* Refactor ChatMessage component: Add EditMessageButton and improve layout

* fix: Add refetch query for current flow messages not all flows

* Refactor ChatMessage component: Add ShadTooltip for EditMessageButton

* add info into edit message field

* fix: migrate

* fix running chat input directly from the flow

* [autofix.ci] apply automated fixes

* fix edit flag

* Refactor IOModal component to generate a unique session ID based on the current date and time

* [autofix.ci] apply automated fixes

* Refactor IOModal component to improve session management and interaction

* [autofix.ci] apply automated fixes

* Refactor sessionSelector component to improve session management and interaction

* chore: Refactor sessionSelector component to improve session management and interaction

* [autofix.ci] apply automated fixes

* create mutation to handle session rename

* refactor: Rename useUpdateSession to useUpdateSessionName for clarity

* [autofix.ci] apply automated fixes

* Refactor sessionSelector component for improved session management and interaction

* Refactor sessionSelector component to update visible session on session name change

* [autofix.ci] apply automated fixes

* add message related events back

* chore: Add console logs for debugging in buildFlowVertices function

* Refactor IOModal component to update tab trigger label from "Memories" to "Chat"

* improve edit name feature

* Refactor IOModal component button label to "New Chat"

* Refactor sessionSelector component to improve session management and interaction

* Refactor IOModal component to remove unused code and improve session management

* fix typing error

* fix run chat input on component level

* prevent toggling visibility on session menu

* fix bug on rename session while in table view mode

* chore: Update setSelectedView prop type in sessionSelector component

* add first test version (not working yet)

* fix bug for renaming and deleting session

* refactor: Update sessionSelector component to handle session changes

* improve test

* fix rename session multiple session bugs

* change visible session from array to string

* chore: Update editMessageField component to include margin-right for text span

* [autofix.ci] apply automated fixes

* Update down_revision in Alembic migration script

* Refactor IOModal component to simplify session visibility handling

* Fix comparison operator for filtering error messages in memory.py

* Refactor ChatInput to conditionally store and update messages

* Refactor JSON formatting for improved readability in starter projects

* Add type casting for message_text and import cast from typing module

* Refactor input handling to use direct dictionary access for 'session' and 'input_value' keys

* Allow `update_message` to accept `str` type for `message_id` parameter

* ⬆️ (pyproject.toml): upgrade duckduckgo-search dependency to version 6.3.1 for bug fixes or new features
🔧 (duckduckgo.spec.ts): refactor test to handle multiple possible outcomes when waiting for selectors and improve readability

* Refactor test file: generalBugs-shard-0.spec.ts

* Refactor test file: freeze.spec.ts

* Refactor test files: update element selectors and actions

* Refactor test file: chatInputOutput.spec.ts

* [autofix.ci] apply automated fixes

* Refactor chatMessage component to handle different types of children content on code modal

* [autofix.ci] apply automated fixes

---------

Co-authored-by: Gabriel Luiz Freitas Almeida <[email protected]>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: italojohnny <[email protected]>
Co-authored-by: cristhianzl <[email protected]>
5 people authored and diogocabral committed Nov 26, 2024
1 parent 46ab673 commit 7be4ebf
Showing 58 changed files with 3,781 additions and 1,310 deletions.
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -105,7 +105,7 @@ dependencies = [
"jq>=1.8.0",
"pydantic-settings==2.4.0",
"ragstack-ai-knowledge-store>=0.2.1",
"duckduckgo-search>=6.3.0",
"duckduckgo-search>=6.3.1",
"langchain-elasticsearch>=0.2.0",
"opensearch-py>=2.7.1",
]
@@ -0,0 +1,49 @@
"""Add error and edit flags to message
Revision ID: eb5e72293a8e
Revises: 5ace73a7f223
Create Date: 2024-09-19 16:18:50.828648
"""

from typing import Sequence, Union

import sqlalchemy as sa
from alembic import op
from sqlalchemy.engine.reflection import Inspector

# revision identifiers, used by Alembic.
revision: str = "eb5e72293a8e"
down_revision: Union[str, None] = "5ace73a7f223"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names() # noqa
column_names = [column["name"] for column in inspector.get_columns("message")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("message", schema=None) as batch_op:
if "error" not in column_names:
batch_op.add_column(sa.Column("error", sa.Boolean(), nullable=False, server_default=sa.false()))
if "edit" not in column_names:
batch_op.add_column(sa.Column("edit", sa.Boolean(), nullable=False, server_default=sa.false()))

# ### end Alembic commands ###


def downgrade() -> None:
conn = op.get_bind()
inspector = Inspector.from_engine(conn) # type: ignore
table_names = inspector.get_table_names() # noqa
column_names = [column["name"] for column in inspector.get_columns("message")]
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table("message", schema=None) as batch_op:
if "edit" in column_names:
batch_op.drop_column("edit")
if "error" in column_names:
batch_op.drop_column("error")

# ### end Alembic commands ###
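The Inspector check makes the upgrade safe to re-run against a database that already has the columns. A minimal sketch of applying the revision with Alembic's programmatic API (assuming a standard `alembic.ini` in the working directory; Langflow normally applies migrations itself on startup):

```python
# Illustrative only: apply pending revisions, including this one, using a
# plain Alembic setup. The config path is an assumption, not Langflow's CLI.
from alembic import command
from alembic.config import Config

alembic_cfg = Config("alembic.ini")
command.upgrade(alembic_cfg, "head")    # adds the `error` and `edit` columns if missing
# command.downgrade(alembic_cfg, "-1")  # would drop them again via downgrade()
```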
3 changes: 3 additions & 0 deletions src/backend/base/langflow/api/v1/chat.py
@@ -155,6 +155,9 @@ async def build_flow(
telemetry_service: TelemetryService = Depends(get_telemetry_service),
session=Depends(get_session),
):
if not inputs:
inputs = InputValueRequest(session=str(flow_id))

async def build_graph_and_get_order() -> tuple[list[str], list[str], Graph]:
start_time = time.perf_counter()
components_count = None
1 change: 1 addition & 0 deletions src/backend/base/langflow/api/v1/monitor.py
@@ -99,6 +99,7 @@ async def update_message(

try:
message_dict = message.model_dump(exclude_unset=True, exclude_none=True)
message_dict["edit"] = True
db_message.sqlmodel_update(message_dict)
session.add(db_message)
session.commit()
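The endpoint performs a partial update: only the fields the client actually sent are written, and every update is force-marked as an edit. A small sketch of that `model_dump` behavior with a stand-in Pydantic model (`MessageUpdate` below is hypothetical, not Langflow's schema):

```python
# Stand-in model for illustration; shows why exclude_unset/exclude_none
# produces a patch containing only the fields the client supplied.
from pydantic import BaseModel


class MessageUpdate(BaseModel):
    text: str | None = None
    sender_name: str | None = None


patch = MessageUpdate(text="corrected text")
message_dict = patch.model_dump(exclude_unset=True, exclude_none=True)
message_dict["edit"] = True  # the endpoint flags every updated row as edited
print(message_dict)  # {'text': 'corrected text', 'edit': True}
```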
4 changes: 4 additions & 0 deletions src/backend/base/langflow/api/v1/schemas.py
@@ -297,6 +297,7 @@ class VerticesBuiltResponse(BaseModel):
class InputValueRequest(BaseModel):
components: list[str] | None = []
input_value: str | None = None
session: str | None = None
type: InputType | None = Field(
"any",
description="Defines on which components the input value should be applied. "
@@ -310,9 +311,12 @@ class InputValueRequest(BaseModel):
{
"components": ["components_id", "Component Name"],
"input_value": "input_value",
"session": "session_id",
},
{"components": ["Component Name"], "input_value": "input_value"},
{"input_value": "input_value"},
{"components": ["Component Name"], "input_value": "input_value", "session": "session_id"},
{"input_value": "input_value", "session": "session_id"},
{"type": "chat", "input_value": "input_value"},
{"type": "json", "input_value": '{"key": "value"}'},
]
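With the new optional `session` field, a caller can pin an input to a specific chat session instead of relying on the flow-wide default. A short sketch (the import path mirrors the file shown above; the session value is a placeholder):

```python
# Assumes the module path shown in the diff header above; "session_id" is a
# placeholder. When inputs are omitted entirely, build_flow now falls back to
# InputValueRequest(session=str(flow_id)), as shown in chat.py above.
from langflow.api.v1.schemas import InputValueRequest

request = InputValueRequest(
    type="chat",
    input_value="Hello!",
    session="session_id",
)
print(request.model_dump(exclude_none=True))
```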
2 changes: 1 addition & 1 deletion src/backend/base/langflow/base/astra_assistants/util.py
@@ -28,7 +28,7 @@ def get_patched_openai_client(shared_component_cache):
data = json.loads(response.text)

# Extract the model names into a Python list
litellm_model_names = [model for model, _ in data.items() if model != "sample_spec"]
litellm_model_names = [model for model in data if model != "sample_spec"]


# To store the class names that extend ToolInterface
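The comprehension change is purely cosmetic: iterating a dict yields its keys, so unpacking `.items()` only to discard the values was unnecessary. For example:

```python
# Both forms produce the same model-name list; the data dict is a made-up
# stand-in for the JSON response parsed above.
data = {"model-a": {"max_tokens": 1}, "sample_spec": {}}

old_way = [model for model, _ in data.items() if model != "sample_spec"]
new_way = [model for model in data if model != "sample_spec"]
assert old_way == new_way == ["model-a"]
```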
96 changes: 54 additions & 42 deletions src/backend/base/langflow/base/io/chat.py
@@ -1,4 +1,5 @@
from collections.abc import AsyncIterator, Iterator
from typing import cast

from langflow.custom import Component
from langflow.memory import store_message
@@ -12,43 +13,52 @@ class ChatComponent(Component):
display_name = "Chat Component"
description = "Use as base for chat components."

# Keep this method for backward compatibility
def store_message(
self,
message: Message,
) -> Message:
messages = store_message(
message,
flow_id=self.graph.flow_id,
)
if len(messages) > 1:
def store_message(self, message: Message) -> Message:
messages = store_message(message, flow_id=self.graph.flow_id)
if len(messages) != 1:
msg = "Only one message can be stored at a time."
raise ValueError(msg)

stored_message = messages[0]
if (
hasattr(self, "_event_manager")
and self._event_manager
and stored_message.id
and not isinstance(message.text, str)
):
self._send_message_event(stored_message)

if self._should_stream_message(stored_message, message):
complete_message = self._stream_message(message, stored_message.id)
message_table = update_message(message_id=stored_message.id, message={"text": complete_message})
stored_message = Message(**message_table.model_dump())
self.vertex._added_message = stored_message
stored_message = self._update_stored_message(stored_message.id, complete_message)

self.status = stored_message
return stored_message

def _send_message_event(self, message: Message):
if hasattr(self, "_event_manager") and self._event_manager:
self._event_manager.on_message(data=message.data)

def _should_stream_message(self, stored_message: Message, original_message: Message) -> bool:
return bool(
hasattr(self, "_event_manager")
and self._event_manager
and stored_message.id
and not isinstance(original_message.text, str)
)

def _update_stored_message(self, message_id: str, complete_message: str) -> Message:
message_table = update_message(message_id=message_id, message={"text": complete_message})
updated_message = Message(**message_table.model_dump())
self.vertex._added_message = updated_message
return updated_message

def _process_chunk(self, chunk: str, complete_message: str, message: Message, message_id: str) -> str:
complete_message += chunk
data = {
"text": complete_message,
"chunk": chunk,
"sender": message.sender,
"sender_name": message.sender_name,
"id": str(message_id),
}
if self._event_manager:
self._event_manager.on_token(data=data)
self._event_manager.on_token(
data={
"text": complete_message,
"chunk": chunk,
"sender": message.sender,
"sender_name": message.sender_name,
"id": str(message_id),
}
)
return complete_message

async def _handle_async_iterator(self, iterator: AsyncIterator, message: Message, message_id: str) -> str:
@@ -69,7 +79,6 @@ def _stream_message(self, message: Message, message_id: str) -> str:
complete_message = ""
for chunk in iterator:
complete_message = self._process_chunk(chunk.content, complete_message, message, message_id)

return complete_message

def build_with_data(
@@ -80,22 +89,25 @@ def build_with_data(
input_value: str | Data | Message | None = None,
files: list[str] | None = None,
session_id: str | None = None,
return_message: bool | None = False,
) -> Message:
if isinstance(input_value, Data):
# Update the data of the record
message = Message.from_data(input_value)
else:
message = Message(
text=input_value, sender=sender, sender_name=sender_name, files=files, session_id=session_id
)
return_message: bool = False,
) -> str | Message:
message = self._create_message(input_value, sender, sender_name, files, session_id)
message_text = message.text if not return_message else message

self.status = message_text
if session_id and isinstance(message, Message) and isinstance(message.text, str):
messages = store_message(
message,
flow_id=self.graph.flow_id,
)
messages = store_message(message, flow_id=self.graph.flow_id)
self.status = messages
return message_text # type: ignore[return-value]
self._send_messages_events(messages)

return cast(str | Message, message_text)

def _create_message(self, input_value, sender, sender_name, files, session_id) -> Message:
if isinstance(input_value, Data):
return Message.from_data(input_value)
return Message(text=input_value, sender=sender, sender_name=sender_name, files=files, session_id=session_id)

def _send_messages_events(self, messages):
if hasattr(self, "_event_manager") and self._event_manager:
for stored_message in messages:
self._event_manager.on_message(data=stored_message.data)
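The streaming path hinges on one check: a `Message` whose `text` is an iterator rather than a plain string is treated as a token stream, accumulated chunk by chunk (each chunk emitted as an `on_token` event by `_process_chunk`), and only persisted in full afterwards. A minimal sketch of that accumulation, independent of Langflow's classes:

```python
# Illustrative only: real chunks come from a model's streaming iterator and
# the per-chunk payloads go through the event manager's on_token callback.
def accumulate_chunks(chunks):
    complete_message = ""
    token_events = []
    for chunk in chunks:
        complete_message += chunk
        token_events.append({"text": complete_message, "chunk": chunk})
    return complete_message, token_events


text, events = accumulate_chunks(["Hel", "lo", "!"])
print(text)         # Hello!
print(len(events))  # 3 on_token-style payloads, one per chunk
```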
15 changes: 4 additions & 11 deletions src/backend/base/langflow/components/inputs/ChatInput.py
@@ -2,7 +2,6 @@
from langflow.base.io.chat import ChatComponent
from langflow.inputs import BoolInput
from langflow.io import DropdownInput, FileInput, MessageTextInput, MultilineInput, Output
from langflow.memory import store_message
from langflow.schema.message import Message
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_USER, MESSAGE_SENDER_USER

@@ -69,18 +68,12 @@ def message_response(self) -> Message:
session_id=self.session_id,
files=self.files,
)

if (
self.session_id
and isinstance(message, Message)
and isinstance(message.text, str)
and self.should_store_message
):
store_message(
if self.session_id and isinstance(message, Message) and self.should_store_message:
stored_message = self.store_message(
message,
flow_id=self.graph.flow_id,
)
self.message.value = message
self.message.value = stored_message
message = stored_message

self.status = message
return message
14 changes: 4 additions & 10 deletions src/backend/base/langflow/components/outputs/ChatOutput.py
@@ -1,7 +1,6 @@
from langflow.base.io.chat import ChatComponent
from langflow.inputs import BoolInput
from langflow.io import DropdownInput, MessageTextInput, Output
from langflow.memory import store_message
from langflow.schema.message import Message
from langflow.utils.constants import MESSAGE_SENDER_AI, MESSAGE_SENDER_NAME_AI, MESSAGE_SENDER_USER

@@ -65,17 +64,12 @@ def message_response(self) -> Message:
sender_name=self.sender_name,
session_id=self.session_id,
)
if (
self.session_id
and isinstance(message, Message)
and isinstance(message.text, str)
and self.should_store_message
):
store_message(
if self.session_id and isinstance(message, Message) and self.should_store_message:
stored_message = self.store_message(
message,
flow_id=self.graph.flow_id,
)
self.message.value = message
self.message.value = stored_message
message = stored_message

self.status = message
return message
4 changes: 3 additions & 1 deletion src/backend/base/langflow/events/event_manager.py
@@ -5,6 +5,7 @@
import uuid
from functools import partial

from fastapi.encoders import jsonable_encoder
from typing_extensions import Protocol

from langflow.schema.log import LoggableType
@@ -52,7 +53,8 @@ def register_event(self, name: str, event_type: str, callback: EventCallback | N
self.events[name] = _callback

def send_event(self, *, event_type: str, data: LoggableType):
json_data = {"event": event_type, "data": data}
jsonable_data = jsonable_encoder(data)
json_data = {"event": event_type, "data": jsonable_data}
event_id = uuid.uuid4()
str_data = json.dumps(json_data) + "\n\n"
self.queue.put_nowait((event_id, str_data.encode("utf-8"), time.time()))
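Routing the payload through `jsonable_encoder` before `json.dumps` is what lets event data containing UUIDs or timestamps (for example a message's `flow_id`) reach the client without a serialization error. A small sketch of the difference:

```python
# json.dumps cannot handle UUID or datetime values directly; jsonable_encoder
# converts them to JSON-safe primitives first. The event name is illustrative.
import json
import uuid
from datetime import datetime, timezone

from fastapi.encoders import jsonable_encoder

data = {"flow_id": uuid.uuid4(), "timestamp": datetime.now(timezone.utc)}
try:
    json.dumps(data)
except TypeError as exc:
    print(f"raw dump fails: {exc}")

encoded = jsonable_encoder(data)
print(json.dumps({"event": "message", "data": encoded}))
```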
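For context, the commit message describes an `EventManager` built around a `Protocol` for callbacks, where registering an event without a custom callback falls back to the default `send_event`. A toy sketch of that shape (not Langflow's actual class, just the pattern):

```python
# Toy illustration of the Protocol-based callback pattern; names and behavior
# are simplified and should not be read as Langflow's real EventManager.
import asyncio
import json
import time
import uuid
from functools import partial
from typing import Protocol


class EventCallback(Protocol):
    def __call__(self, *, event_type: str, data: dict) -> None: ...


class ToyEventManager:
    def __init__(self, queue: asyncio.Queue) -> None:
        self.queue = queue
        self.events: dict[str, partial] = {}

    def send_event(self, *, event_type: str, data: dict) -> None:
        payload = json.dumps({"event": event_type, "data": data}) + "\n\n"
        self.queue.put_nowait((uuid.uuid4(), payload.encode("utf-8"), time.time()))

    def register_event(self, name: str, event_type: str, callback: EventCallback | None = None) -> None:
        # No callback passed -> the default send_event is used for this event type.
        self.events[name] = partial(callback or self.send_event, event_type=event_type)

    def __getattr__(self, name: str):
        # e.g. manager.on_token(data=...) dispatches to the registered callback.
        return self.events[name]


queue: asyncio.Queue = asyncio.Queue()
manager = ToyEventManager(queue)
manager.register_event("on_token", "token")
manager.on_token(data={"chunk": "Hel"})
print(queue.get_nowait()[1].decode())
```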
15 changes: 12 additions & 3 deletions src/backend/base/langflow/graph/vertex/base.py
@@ -793,11 +793,20 @@ async def build(
# and we are just getting the result for the requester
return await self.get_requester_result(requester)
self._reset()

# inject session_id if it is not None
if inputs is not None and "session" in inputs and inputs["session"] is not None and self.has_session_id:
session_id_value = self.get_value_from_template_dict("session_id")
if session_id_value == "":
self.update_raw_params({"session_id": inputs["session"]}, overwrite=True)
if self._is_chat_input() and (inputs or files):
chat_input = {}
if inputs:
chat_input.update({"input_value": inputs.get(INPUT_FIELD_NAME, "")})
if (
inputs
and isinstance(inputs, dict)
and "input_value" in inputs
and inputs["input_value"] is not None
):
chat_input.update({"input_value": inputs[INPUT_FIELD_NAME]})
if files:
chat_input.update({"files": files})

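The injection rule only fills in a session when the vertex exposes a `session_id` field that was left empty; an explicitly configured session is never overwritten. A plain-Python sketch of that rule (a simplified stand-in for the template-dict lookup and `update_raw_params` call above):

```python
# Simplified stand-in for Vertex.build's session handling; the real code reads
# session_id from the vertex's template dict and calls update_raw_params.
def resolve_session_id(template_session_id: str, inputs: dict | None) -> str:
    if inputs and inputs.get("session") is not None and template_session_id == "":
        return inputs["session"]
    return template_session_id


print(resolve_session_id("", {"input_value": "hi", "session": "abc"}))       # -> abc
print(resolve_session_id("fixed", {"input_value": "hi", "session": "abc"}))  # -> fixed
```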