Autogenstudio Updates [Upload/Download of Skills/Workflows, Streaming Agent Replies, Agent Message Summarization] #1801

Merged
merged 58 commits on Mar 16, 2024
58 commits (changes shown below are from 29 of them)
3f4b6e5
update default agent system message
victordibia Feb 6, 2024
508c260
session friendly name functionality
jluey1 Feb 6, 2024
cde2a61
minor formatting
jluey1 Feb 6, 2024
00542fa
fix issues with groupchat and version bump
victordibia Feb 7, 2024
95225fb
fix issues with groupchat and version bump. address #1580
victordibia Feb 7, 2024
0cf32d6
Merge branch 'autogenstudio' of github.com:microsoft/autogen into aut…
victordibia Feb 7, 2024
1abb1da
update groupchat system message
victordibia Feb 7, 2024
df67efc
add support for copying message in chatbox
victordibia Feb 7, 2024
393a7bd
rewrite how agent history is maintained in workflow manager. Directly…
victordibia Feb 7, 2024
f324f55
general qol updates
victordibia Feb 8, 2024
4c6f10e
add support for downloading + copying skills, models and agents from UI
victordibia Feb 9, 2024
7d73cc6
add regex check to agent name, address #1507
victordibia Feb 9, 2024
2be65b7
add support for uploading workflow files
victordibia Feb 9, 2024
b3e8193
refactor, add support for streaming intermediate agent response to ui
victordibia Feb 17, 2024
ab7689e
improve streaming ux
victordibia Feb 22, 2024
4b35780
add support for uploading skills, models, agents, workflows
victordibia Feb 22, 2024
235f0d3
add datamodel for socket message
victordibia Feb 23, 2024
f42e9bd
version update
victordibia Feb 23, 2024
62edeab
fix chatbox height bug
victordibia Feb 23, 2024
13c63a5
fix csv pagination issue
victordibia Feb 23, 2024
d733c12
improve hidden menu for uploading entities
victordibia Feb 23, 2024
3ab348a
fix minor issue with animation timing on chat interface
victordibia Feb 23, 2024
19830a1
version bump, css fixes
victordibia Feb 26, 2024
cba783a
Merge pull request #1565 from jluey1/main
victordibia Feb 26, 2024
b3a35e1
use description field in autogen conversable class for description
victordibia Feb 27, 2024
3ac352d
add implementation for llm summarization of agent chat
victordibia Feb 27, 2024
04fe51c
support for llm summary of agent history
victordibia Feb 27, 2024
1e76efb
formatting fixes
victordibia Feb 27, 2024
84ecb38
formatting updates
victordibia Feb 27, 2024
17ddf3a
add dockerfile to run autogenstudio in a docker container
victordibia Mar 5, 2024
150ca56
autogenstudio docker container
victordibia Mar 5, 2024
f6ac75e
updates to websockets
victordibia Mar 5, 2024
54ec7e5
update socket connection logic,
victordibia Mar 5, 2024
655c1ba
support using socket for passing message requests where a socket is a…
victordibia Mar 5, 2024
48de3ef
improve command for building frontend
victordibia Mar 5, 2024
2e51cb3
formatting updates
victordibia Mar 5, 2024
1f94cc8
remove duplicated code
cillyfly Mar 7, 2024
378223b
Merge pull request #1897 from cillyfly/patch-1
victordibia Mar 7, 2024
ee9d1dc
update description location
cillyfly Mar 7, 2024
7b331a2
Merge pull request #1900 from cillyfly/patch-2
victordibia Mar 7, 2024
5583395
version bump
victordibia Mar 11, 2024
f3acd98
refactor to ensure each session and call within a session has an inde…
victordibia Mar 11, 2024
39ccc8d
support use of socket for sending messages where available
victordibia Mar 11, 2024
f58d304
use rsync to copy built files to ui directory instead of cp -rT
victordibia Mar 11, 2024
309e284
spelling correction
victordibia Mar 11, 2024
d0de02d
Merge branch 'autogenstudio' of github.com:microsoft/autogen into aut…
victordibia Mar 11, 2024
451e0ea
readme update
victordibia Mar 13, 2024
8f6c89a
fix numpy version
victordibia Mar 13, 2024
938a964
version bump
victordibia Mar 13, 2024
15d043f
add support for dot env variables and updating default app dir to /ho…
victordibia Mar 13, 2024
4aef2aa
formatting update
victordibia Mar 13, 2024
cfe0ef2
Merge branch 'main' into autogenstudio
victordibia Mar 13, 2024
6e91728
Merge branch 'main' into autogenstudio
victordibia Mar 13, 2024
a686bbb
Merge branch 'main' into autogenstudio
victordibia Mar 14, 2024
d7874b6
update gitignore
victordibia Mar 14, 2024
eb30d2a
Merge branch 'autogenstudio' of github.com:microsoft/autogen into aut…
victordibia Mar 14, 2024
2c714b2
formatting updates
victordibia Mar 14, 2024
1000432
Merge branch 'main' into autogenstudio
victordibia Mar 14, 2024
1 change: 1 addition & 0 deletions notebook/agentchat_custom_model.ipynb
@@ -383,6 +383,7 @@
"source": [
"# load model here\n",
"\n",
"\n",
"config = config_list_custom[0]\n",
"device = config.get(\"device\", \"cpu\")\n",
"loaded_model = AutoModelForCausalLM.from_pretrained(config[\"model\"]).to(device)\n",
219 changes: 188 additions & 31 deletions samples/apps/autogen-studio/autogenstudio/chatmanager.py
@@ -1,63 +1,220 @@
import json
from queue import Queue
import time
from typing import List
from .datamodel import AgentWorkFlowConfig, Message
from .utils import extract_successful_code_blocks, get_default_agent_config, get_modified_files
from .workflowmanager import AutoGenWorkFlowManager
from typing import Any, List, Dict, Optional
import os
from fastapi import WebSocket, WebSocketDisconnect
import websockets
from .datamodel import AgentWorkFlowConfig, Message, SocketMessage
from .utils import extract_successful_code_blocks, get_modified_files, summarize_chat_history
from .workflowmanager import AutoGenWorkFlowManager


class AutoGenChatManager:
def __init__(self) -> None:
pass
"""
This class handles the automated generation and management of chat interactions
using an automated workflow configuration and message queue.
"""

def __init__(self, message_queue: Queue) -> None:
"""
Initializes the AutoGenChatManager with a message queue.

:param message_queue: A queue to use for sending messages asynchronously.
"""
self.message_queue = message_queue

def send(self, message: str) -> None:
"""
Sends a message by putting it into the message queue.

:param message: The message string to be sent.
"""
if self.message_queue is not None:
self.message_queue.put_nowait(message)

def chat(
self,
message: Message,
history: List[Dict[str, Any]],
flow_config: Optional[AgentWorkFlowConfig] = None,
connection_id: Optional[str] = None,
**kwargs,
) -> Message:
"""
Processes an incoming message according to the agent's workflow configuration
and generates a response.

def chat(self, message: Message, history: List, flow_config: AgentWorkFlowConfig = None, **kwargs) -> None:
:param message: An instance of `Message` representing an incoming message.
:param history: A list of dictionaries, each representing a past interaction.
:param flow_config: An instance of `AgentWorkFlowConfig`. If None, defaults to a standard configuration.
:param connection_id: An optional connection identifier.
:param kwargs: Additional keyword arguments.
:return: An instance of `Message` representing a response.
"""
work_dir = kwargs.get("work_dir", None)
if work_dir is None:
raise ValueError("work_dir must be specified")

scratch_dir = os.path.join(work_dir, "scratch")
os.makedirs(scratch_dir, exist_ok=True)

# if no flow config is provided, use the default
if flow_config is None:
flow_config = get_default_agent_config(scratch_dir)
raise ValueError("flow_config must be specified")

flow = AutoGenWorkFlowManager(
config=flow_config,
history=history,
work_dir=scratch_dir,
send_message_function=self.send,
connection_id=connection_id,
)

flow = AutoGenWorkFlowManager(config=flow_config, history=history, work_dir=scratch_dir)
message_text = message.content.strip()

output = ""
start_time = time.time()

metadata = {}
flow.run(message=f"{message_text}", clear_history=False)
end_time = time.time()

metadata["messages"] = flow.agent_history
metadata = {
"messages": flow.agent_history,
"summary_method": flow_config.summary_method,
"time": end_time - start_time,
"code": "", # Assuming that this is intentionally left empty
"files": get_modified_files(start_time, end_time, scratch_dir, dest_dir=work_dir),
}

output = ""
print("Modified files: ", len(metadata["files"]))

output = self._generate_output(message_text, flow, flow_config)

output_message = Message(
user_id=message.user_id,
root_msg_id=message.root_msg_id,
role="assistant",
content=output,
metadata=json.dumps(metadata),
session_id=message.session_id,
)

return output_message

def _generate_output(
self, message_text: str, flow: AutoGenWorkFlowManager, flow_config: AgentWorkFlowConfig
) -> str:
"""
Generates the output response based on the workflow configuration and agent history.

:param message_text: The text of the incoming message.
:param flow: An instance of `AutoGenWorkFlowManager`.
:param flow_config: An instance of `AgentWorkFlowConfig`.
:return: The output response as a string.
"""

output = ""
if flow_config.summary_method == "last":
successful_code_blocks = extract_successful_code_blocks(flow.agent_history)
last_message = flow.agent_history[-1]["message"]["content"] if flow.agent_history else ""
successful_code_blocks = "\n\n".join(successful_code_blocks)
output = (last_message + "\n" + successful_code_blocks) if successful_code_blocks else last_message
elif flow_config.summary_method == "llm":
output = ""
model = flow.config.receiver.config.llm_config.config_list[0]
status_message = SocketMessage(
type="agent_status",
data={"status": "summarizing", "message": "Generating summary of agent dialogue"},
connection_id=flow.connection_id,
)
self.send(status_message.dict())
output = summarize_chat_history(task=message_text, messages=flow.agent_history, model=model)
print("Output: ", output)

elif flow_config.summary_method == "none":
output = ""

metadata["code"] = ""
metadata["summary_method"] = flow_config.summary_method
end_time = time.time()
metadata["time"] = end_time - start_time
modified_files = get_modified_files(start_time, end_time, scratch_dir, dest_dir=work_dir)
metadata["files"] = modified_files
return output

print("Modified files: ", len(modified_files))

output_message = Message(
user_id=message.user_id,
root_msg_id=message.root_msg_id,
role="assistant",
content=output,
metadata=json.dumps(metadata),
session_id=message.session_id,
)
class WebSocketConnectionManager:
"""
Manages WebSocket connections including sending, broadcasting, and managing the lifecycle of connections.
"""

return output_message
def __init__(self, active_connections: List[WebSocket] = None) -> None:
"""
Initializes WebSocketConnectionManager with an optional list of active WebSocket connections.

:param active_connections: A list of WebSocket objects representing the current active connections.
"""
if active_connections is None:
active_connections = []
self.active_connections: List[WebSocket] = active_connections
self.socket_store: Dict[WebSocket, str] = {}

async def connect(self, websocket: WebSocket, client_id: str) -> None:
"""
Accepts a new WebSocket connection and appends it to the active connections list.

:param websocket: The WebSocket instance representing a client connection.
:param client_id: A string representing the unique identifier of the client.
"""
await websocket.accept()
self.active_connections.append(websocket)
self.socket_store[websocket] = client_id
print(f"New Connection: {client_id}, Total: {len(self.active_connections)}")

def disconnect(self, websocket: WebSocket) -> None:
"""
Disconnects and removes a WebSocket connection from the active connections list.

:param websocket: The WebSocket instance to remove.
"""
try:
self.active_connections.remove(websocket)
del self.socket_store[websocket]
print(f"Connection Closed. Total: {len(self.active_connections)}")
except ValueError:
print("Error: WebSocket connection not found")

def disconnect_all(self) -> None:
"""
Disconnects all active WebSocket connections.
"""
for connection in self.active_connections[:]:
self.disconnect(connection)

async def send_message(self, message: Dict, websocket: WebSocket) -> None:
"""
Sends a JSON message to a single WebSocket connection.

:param message: A JSON serializable dictionary containing the message to send.
:param websocket: The WebSocket instance through which to send the message.
"""
try:
await websocket.send_json(message)
except WebSocketDisconnect:
print("Error: Tried to send a message to a closed WebSocket")
self.disconnect(websocket)
except websockets.exceptions.ConnectionClosedOK:
print("Error: WebSocket connection closed normally")
self.disconnect(websocket)

async def broadcast(self, message: str) -> None:
"""
Broadcasts a text message to all active WebSocket connections.

:param message: A string containing the message to broadcast.
"""
for connection in self.active_connections[:]:
try:
if connection.client_state == websockets.protocol.State.OPEN:
await connection.send_text(message)
else:
print("Error: WebSocket connection is closed")
self.disconnect(connection)
except WebSocketDisconnect:
print("Error: Tried to send a message to a closed WebSocket")
self.disconnect(connection)
except websockets.exceptions.ConnectionClosedOK:
print("Error: WebSocket connection closed normally")
self.disconnect(connection)
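
Taken together, the new chatmanager.py introduces two cooperating pieces: AutoGenChatManager, which now takes a message queue and streams intermediate agent messages through its send() method, and WebSocketConnectionManager, which tracks active sockets per client id. Below is a minimal wiring sketch, assuming a FastAPI app similar to the AutoGen Studio web app; the endpoint path, the message_handler coroutine, and the startup hook are illustrative assumptions, not the exact application code.

import asyncio
from queue import Queue

from fastapi import FastAPI, WebSocket, WebSocketDisconnect

from autogenstudio.chatmanager import AutoGenChatManager, WebSocketConnectionManager

app = FastAPI()
message_queue: Queue = Queue()                      # drained below and fanned out to sockets
websocket_manager = WebSocketConnectionManager()
chat_manager = AutoGenChatManager(message_queue=message_queue)


async def message_handler() -> None:
    # Forward queued intermediate messages (SocketMessage.dict() payloads) to the matching client.
    loop = asyncio.get_running_loop()
    while True:
        message = await loop.run_in_executor(None, message_queue.get)
        connection_id = message.get("connection_id")
        for socket, client_id in list(websocket_manager.socket_store.items()):
            if client_id == connection_id:
                await websocket_manager.send_message(message, socket)
        message_queue.task_done()


@app.on_event("startup")
async def start_queue_worker() -> None:
    asyncio.create_task(message_handler())


@app.websocket("/ws/{client_id}")
async def websocket_endpoint(websocket: WebSocket, client_id: str) -> None:
    await websocket_manager.connect(websocket, client_id)
    try:
        while True:
            # Incoming requests could be routed to chat_manager.chat(..., connection_id=client_id).
            await websocket.receive_json()
    except WebSocketDisconnect:
        websocket_manager.disconnect(websocket)

With wiring along these lines, flow_config.summary_method ("last", "llm", or "none") controls how _generate_output builds the final reply, and the "llm" path emits an agent_status socket message while the summary is produced.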
16 changes: 14 additions & 2 deletions samples/apps/autogen-studio/autogenstudio/datamodel.py
@@ -111,6 +111,7 @@ class AgentConfig:
is_termination_msg: Optional[Union[bool, str, Callable]] = None
code_execution_config: Optional[Union[bool, str, Dict[str, Any]]] = None
default_auto_reply: Optional[str] = ""
description: Optional[str] = None

def dict(self):
result = asdict(self)
@@ -129,7 +130,6 @@ class AgentFlowSpec:
timestamp: Optional[str] = None
user_id: Optional[str] = None
skills: Optional[Union[None, List[Skill]]] = None
description: Optional[str] = None

def __post_init__(self):
if self.timestamp is None:
@@ -173,7 +173,6 @@ class GroupChatFlowSpec:
id: Optional[str] = None
timestamp: Optional[str] = None
user_id: Optional[str] = None
description: Optional[str] = None
skills: Optional[Union[None, List[Skill]]] = None

def __post_init__(self):
@@ -301,3 +300,16 @@ class DBWebRequestModel(object):
agent: Optional[AgentFlowSpec] = None
workflow: Optional[AgentWorkFlowConfig] = None
model: Optional[Model] = None
message: Optional[Message] = None
connection_id: Optional[str] = None


@dataclass
class SocketMessage(object):
connection_id: str
data: Dict[str, Any]
type: str

def dict(self):
result = asdict(self)
return result
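
Two datamodel changes stand out: the description field moves off the flow-spec wrappers (AgentFlowSpec, GroupChatFlowSpec) and onto AgentConfig itself, and a new SocketMessage dataclass tags websocket payloads with a connection id. A small usage sketch, assuming AgentConfig's other fields keep their defaults; the connection id below is a placeholder.

from autogenstudio.datamodel import AgentConfig, SocketMessage

# description now lives on the agent config rather than the surrounding flow spec
config = AgentConfig(
    name="primary_assistant",
    description="A primary assistant agent that writes plans and code to solve tasks.",
)

status = SocketMessage(
    connection_id="client-123",  # placeholder; ties the payload to one websocket client
    type="agent_status",
    data={"status": "summarizing", "message": "Generating summary of agent dialogue"},
)
payload = status.dict()  # plain dict, ready for WebSocket.send_json or the message queue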
14 changes: 8 additions & 6 deletions samples/apps/autogen-studio/autogenstudio/utils/dbdefaults.json
@@ -22,7 +22,7 @@
"agents": [
{
"type": "userproxy",
"description": "A user proxy agent that executes code.",

"config": {
"name": "userproxy",
"human_input_mode": "NEVER",
@@ -33,12 +33,12 @@
"code_execution_config": {
"work_dir": null,
"use_docker": false
}
},
"description": "A user proxy agent that executes code."
}
},
{
"type": "assistant",
"description": "A primary assistant agent that writes plans and code to solve tasks.",
"skills": [
{
"title": "find_papers_arxiv",
@@ -54,6 +54,7 @@
],
"config": {
"name": "primary_assistant",
"description": "A primary assistant agent that writes plans and code to solve tasks.",
"llm_config": {
"config_list": [
{
@@ -138,7 +139,7 @@
},
"human_input_mode": "NEVER",
"max_consecutive_auto_reply": 8,
"system_message": "You are a helpful assistant that can suggest a travel plan for a user. You are the primary cordinator who will receive suggestions or advice from other agents (local_assistant, language_assistant). You must ensure that the finally plan integrates the suggestions from other agents or team members. YOUR FINAL RESPONSE MUST BE THE COMPLETE PLAN that ends with the word TERMINATE. "
"system_message": "You are a helpful assistant that can suggest a travel plan for a user. You are the primary cordinator who will receive suggestions or advice from other agents (local_assistant, language_assistant). You must ensure that the finally plan integrates the suggestions from other agents or team members. YOUR FINAL RESPONSE MUST BE THE COMPLETE PLAN. When the plan is complete and all perspectives are integrated, you can respond with TERMINATE."
}
},
{
@@ -188,9 +189,9 @@
"description": "This workflow is used for general purpose tasks.",
"sender": {
"type": "userproxy",
"description": "A user proxy agent that executes code.",
"config": {
"name": "userproxy",
"description": "A user proxy agent that executes code.",
"human_input_mode": "NEVER",
"max_consecutive_auto_reply": 10,
"system_message": "You are a helpful assistant.",
@@ -204,7 +205,7 @@
},
"receiver": {
"type": "assistant",
"description": "Default assistant to generate plans and write code to solve tasks.",

"skills": [
{
"title": "find_papers_arxiv",
@@ -218,6 +219,7 @@
}
],
"config": {
"description": "Default assistant to generate plans and write code to solve tasks.",
"name": "primary_assistant",
"llm_config": {
"config_list": [
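
Consistent with the datamodel change, the default agent entries in dbdefaults.json now carry description inside their config block instead of at the top level of the spec, and the travel group-chat coordinator's system message is relaxed so TERMINATE is only issued once the plan is complete. An abridged sketch of the updated spec shape (fields omitted for brevity):

# Abridged shape of an updated default agent spec (not the full dbdefaults.json entry).
userproxy_spec = {
    "type": "userproxy",
    "config": {
        "name": "userproxy",
        "description": "A user proxy agent that executes code.",  # moved into config
        "human_input_mode": "NEVER",
        "code_execution_config": {"work_dir": None, "use_docker": False},
    },
}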