Add another batch of notebooks to the website #1969

Merged 3 commits on Mar 15, 2024
88 changes: 19 additions & 69 deletions notebook/agentchat_groupchat_vis.ipynb
@@ -5,35 +5,21 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_groupchat_vis.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Auto Generated Agent Chat: Group Chat with Coder and Visualization Critic\n",
"# Group Chat with Coder and Visualization Critic\n",
"\n",
"AutoGen offers conversable agents powered by LLM, tool or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation.\n",
"Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install:\n",
"````{=mdx}\n",
":::info Requirements\n",
"Install `pyautogen`:\n",
"```bash\n",
"pip install pyautogen\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"%%capture --no-stderr\n",
"# %pip install \"pyautogen>=0.2.3\""
"```\n",
"\n",
"For more information, please refer to the [installation guide](/docs/installation/).\n",
":::\n",
"````"
]
},
{
@@ -64,60 +50,20 @@
" filter_dict={\n",
" \"model\": [\"gpt-4\", \"gpt-4-0314\", \"gpt4\", \"gpt-4-32k\", \"gpt-4-32k-0314\", \"gpt-4-32k-v0314\"],\n",
" },\n",
")\n",
"# config_list_gpt35 = autogen.config_list_from_json(\n",
"# \"OAI_CONFIG_LIST\",\n",
"# filter_dict={\n",
"# \"model\": {\n",
"# \"gpt-3.5-turbo\",\n",
"# \"gpt-3.5-turbo-16k\",\n",
"# \"gpt-3.5-turbo-0301\",\n",
"# \"chatgpt-35-turbo-0301\",\n",
"# \"gpt-35-turbo-v0301\",\n",
"# },\n",
"# },\n",
"# )"
")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"It first looks for environment variable \"OAI_CONFIG_LIST\" which needs to be a valid json string. If that variable is not found, it then looks for a json file named \"OAI_CONFIG_LIST\". It filters the configs by models (you can filter by other keys as well). Only the gpt-4 models are kept in the list based on the filter condition.\n",
"\n",
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" },\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2024-02-15-preview',\n",
" },\n",
" {\n",
" 'model': 'gpt-4-32k',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2024-02-15-preview',\n",
" },\n",
"]\n",
"```\n",
"````{=mdx}\n",
":::tip\n",
"Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).\n",
":::\n",
"````\n",
"\n",
"You can set the value of config_list in any way you prefer. Please refer to this [notebook](/docs/topics/llm_configuration) for full code examples of the different methods."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Construct Agents"
]
},
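The agent construction that follows is collapsed in this diff. For orientation, a minimal group-chat setup under pyautogen 0.2.x conventions might look like the sketch below; the agent names, system messages, round limit, and the `config_list_gpt4` variable are assumptions inferred from the cells above, not the notebook's exact code.

```python
import autogen

llm_config = {"config_list": config_list_gpt4, "cache_seed": 42}

# A human admin proxy that can execute the generated code locally.
user_proxy = autogen.UserProxyAgent(
    name="User_proxy",
    system_message="A human admin.",
    code_execution_config={"work_dir": "groupchat", "use_docker": False},
    human_input_mode="TERMINATE",
)

# The coder writes the visualization code; the critic reviews it.
coder = autogen.AssistantAgent(name="Coder", llm_config=llm_config)
critic = autogen.AssistantAgent(
    name="Critic",
    system_message="Critique the visualization code and suggest concrete improvements.",
    llm_config=llm_config,
)

groupchat = autogen.GroupChat(agents=[user_proxy, coder, critic], messages=[], max_round=20)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(manager, message="Plot a chart of META and TESLA stock price change YTD.")
```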
@@ -1029,6 +975,10 @@
}
],
"metadata": {
"front_matter": {
"tags": ["group chat"],
"description": "Explore a group chat example using agents such as a coder and visualization agent."
},
"kernelspec": {
"display_name": "flaml",
"language": "python",
45 changes: 33 additions & 12 deletions notebook/agentchat_nestedchat_optiguide.ipynb
@@ -16,9 +16,17 @@
" \n",
"In addition to AutoGen, this notebook also requires eventlet and Gurobipy. The eventlet package is used in this notebook to constrain code execution with a timeout, and the gurobipy package is a mathematical optimization software library for solving mixed-integer linear and quadratic optimization problems. \n",
"\n",
"````{=mdx}\n",
":::info Requirements\n",
"Some extra dependencies are needed for this notebook, which can be installed via pip:\n",
"\n",
"```bash\n",
"pip install pyautogen eventlet gurobipy\n",
"```\n",
"\n",
"For more information, please refer to the [installation guide](/docs/installation/).\n",
":::\n",
"````\n",
"\n"
]
},
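The eventlet timeout mentioned above deserves a quick illustration. A minimal guard might look like the sketch below; this is not the notebook's exact code, and the function name and default limit are illustrative.

```python
import eventlet


def run_with_timeout(fn, seconds: int = 60):
    """Run fn() but give up if it does not finish within `seconds`.

    eventlet.Timeout raises inside the green thread once the deadline passes;
    note that purely blocking C calls that never yield may not be interrupted.
    """
    try:
        with eventlet.Timeout(seconds):
            return fn()
    except eventlet.Timeout:
        return f"Timeout: exceeded {seconds} seconds."
```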
@@ -54,7 +62,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# OptiGuide with Nested Chat\n",
"````{=mdx}\n",
":::tip\n",
"Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).\n",
":::\n",
"````\n",
"\n",
"Intended agent orchestration in OptiGuide."
]
@@ -67,19 +79,19 @@
"\n",
"\n",
"\n",
"### Step 0. Prepare Helper Functions\n",
"## Step 0. Prepare Helper Functions\n",
"The code cell below includes several helper functions to be used by agents. The helper functions are adopted directly from [OptiGuide](https://github.com/microsoft/OptiGuide).\n",
"\n",
"#### Utility Functions\n",
"### Utility Functions\n",
"Several utility functions (replace, insert_code, run_with_exec) are defined to manipulate and execute the source code dynamically:\n",
"\n",
"- **replace(src_code, old_code, new_code) -> str:**<br>Replaces a specified block of code within the source code with new code. This is essential for updating the code dynamically based on chatbot and user interactions.\n",
"- **replace(src_code, old_code, new_code) -> str:**<br/>Replaces a specified block of code within the source code with new code. This is essential for updating the code dynamically based on chatbot and user interactions.\n",
"\n",
"- **insert_code(src_code, new_lines) -> str:**<br>Determines where to insert new lines of code based on specific markers in the source code. It's used to seamlessly integrate generated code snippets.\n",
"- **insert_code(src_code, new_lines) -> str:**<br/>Determines where to insert new lines of code based on specific markers in the source code. It's used to seamlessly integrate generated code snippets.\n",
"\n",
"- **run_with_exec(src_code) -> Union[str, Exception]:**<br>Executes the modified source code within a controlled environment, capturing and returning the output or any errors that occur. This function is crucial for testing the feasibility and correctness of the generated code solutions.\n",
"- **run_with_exec(src_code) -> Union[str, Exception]:**<br/>Executes the modified source code within a controlled environment, capturing and returning the output or any errors that occur. This function is crucial for testing the feasibility and correctness of the generated code solutions.\n",
"\n",
"#### Functions External Code Retrieval\n",
"### Functions External Code Retrieval\n",
"The cell also includes logic to retrieve example source code from a remote repository using the requests library. This example code serves as a base for the chatbot to work with, allowing users to see real-life applications of the concepts discussed.\n",
"\n",
"The retrieved code is then displayed, showing both the beginning and the end of the source code to give a glimpse of its structure and content."
@@ -201,13 +213,13 @@
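To make the helper descriptions above concrete, here is a rough sketch of what `replace` and `insert_code` could look like. This is not the OptiGuide implementation (which also preserves indentation and uses dedicated data/constraint markers); the marker string below is an assumption for illustration.

```python
def replace(src_code: str, old_code: str, new_code: str) -> str:
    """Swap one block of code for another inside the source string."""
    if old_code not in src_code:
        raise ValueError("old_code was not found in src_code")
    return src_code.replace(old_code, new_code)


def insert_code(src_code: str, new_lines: str,
                marker: str = "# CODE GOES HERE") -> str:
    """Insert generated lines at a marker comment in the source."""
    return src_code.replace(marker, new_lines)
```

`run_with_exec` then executes the patched source under a timeout and returns either the captured output or the raised exception.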
"\n",
"This cell introduces the Writer and OptiGuide agent classes and their instances to manage the interaction between the user, the chatbot, and the optimization solver. This streamlines the process of generating, evaluating, and integrating code solutions for supply chain optimization problems.\n",
"\n",
"#### Classes Defined\n",
"### Classes Defined\n",
"\n",
"- **`OptiGuide`**: Inherits from `autogen.AssistantAgent` and serves as the main class for handling the supply chain optimization logic. It maintains state information like the source code, debugging attempts left, success status, and user chat history. Key methods include `set_success` and `update_debug_times`, which are used to update the agent's state based on the outcomes of interactions.\n",
"\n",
"- **`Writer`**: Also inherits from `autogen.AssistantAgent`, this class is tailored to manage the generation and explanation of Python code solutions. It keeps track of the source code and example Q&A to assist in generating responses to user queries.\n",
"\n",
"#### Agent Instances\n",
"### Agent Instances\n",
"\n",
"- **`writer`**, **`safeguard`**, and **`optiguide_commander`**: Instances of the classes defined above, each configured with specific roles in the code generation and evaluation process. These agents work together to ensure that the user's questions are answered with safe, optimized, and understandable Python code solutions.\n",
"\n",
@@ -402,7 +414,7 @@
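A rough skeleton of the two classes described above is shown below; the attribute names and defaults mirror the description but are assumptions, not the notebook's exact definitions.

```python
import autogen


class OptiGuide(autogen.AssistantAgent):
    """Assistant that carries the state of the nested OptiGuide workflow."""

    source_code: str = ""        # the Gurobi model being edited
    debug_times_left: int = 3    # remaining debugging attempts
    success: bool = False        # whether a satisfactory answer was produced
    user_chat_history: str = ""  # running transcript with the user

    def set_success(self, success: bool) -> None:
        self.success = success

    def update_debug_times(self) -> None:
        self.debug_times_left -= 1


class Writer(autogen.AssistantAgent):
    """Assistant that drafts and explains candidate Python/Gurobi code."""

    source_code: str = ""
    example_qa: str = ""
```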
"cell_type": "markdown",
"metadata": {},
"source": [
"### Step 2. Orchestrate Nested Chats \n",
"## Step 2. Orchestrate Nested Chats \n",
"\n",
"These three agent instances are orchestrated in the following way with the `writer` and `safeguard` nested into the `commander` agent as the inner monologue.\n",
"\n",
Expand Down Expand Up @@ -507,7 +519,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Let the agents talk"
"### Let the agents talk"
]
},
{
@@ -1073,7 +1085,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Get Final Results from the Returned ChatResult Object "
"### Get Final Results from the Returned ChatResult Object "
]
},
{
@@ -1103,6 +1115,15 @@
}
],
"metadata": {
"extra_files_to_copy": [
"optiGuide_new_design.png"
],
"front_matter": {
"description": "This is a nested chat re-implementation of OptiGuide which is an LLM-based supply chain optimization framework.",
"tags": [
"nested chat"
]
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
58 changes: 17 additions & 41 deletions notebook/agentchat_qdrant_RetrieveChat.ipynb
@@ -1,18 +1,10 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a href=\"https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/agentchat_qdrant_RetrieveChat.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"toc\"></a>\n",
"# Using RetrieveChat with Qdrant for Retrieve Augmented Code Generation and Question Answering\n",
"\n",
"[Qdrant](https://qdrant.tech/) is a high-performance vector search engine/database.\n",
@@ -24,13 +16,17 @@
"\n",
"We'll demonstrate usage of RetrieveChat with Qdrant for code generation and question answering w/ human feedback.\n",
"\n",
"````{=mdx}\n",
":::info Requirements\n",
"Some extra dependencies are needed for this notebook, which can be installed via pip:\n",
"\n",
"## Requirements\n",
"\n",
"AutoGen requires `Python>=3.8`. To run this notebook example, please install the [retrievechat] option.\n",
"```bash\n",
"pip install \"pyautogen[retrievechat]>=0.2.3\" \"flaml[automl]\" \"qdrant_client[fastembed]\"\n",
"```"
"```\n",
"\n",
"For more information, please refer to the [installation guide](/docs/installation/).\n",
":::\n",
"````"
]
},
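Before the detailed cells, a compressed sketch of the flow this notebook builds up is shown below. The import paths and parameters follow pyautogen 0.2.x conventions and are assumptions; the docs path and question are placeholders rather than the notebook's actual values.

```python
import autogen
from autogen.agentchat.contrib.qdrant_retrieve_user_proxy_agent import QdrantRetrieveUserProxyAgent
from autogen.agentchat.contrib.retrieve_assistant_agent import RetrieveAssistantAgent
from qdrant_client import QdrantClient

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = RetrieveAssistantAgent(
    name="assistant",
    system_message="You are a helpful assistant.",
    llm_config={"config_list": config_list},
)

ragproxyagent = QdrantRetrieveUserProxyAgent(
    name="ragproxyagent",
    human_input_mode="NEVER",
    retrieve_config={
        "task": "code",
        "docs_path": "https://raw.githubusercontent.com/microsoft/flaml/main/README.md",  # placeholder doc
        "client": QdrantClient(":memory:"),  # in-memory Qdrant instance
    },
)

# The proxy retrieves relevant chunks from Qdrant and prepends them to the
# question before handing the conversation to the assistant.
ragproxyagent.initiate_chat(assistant, problem="How can I use FLAML to perform a classification task?")
```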
{
@@ -253,35 +249,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It first looks for environment variable \"OAI_CONFIG_LIST\" which needs to be a valid json string. If that variable is not found, it then looks for a json file named \"OAI_CONFIG_LIST\". It filters the configs by models (you can filter by other keys as well). Only the gpt-4 and gpt-3.5-turbo models are kept in the list based on the filter condition.\n",
"\n",
"The config list looks like the following:\n",
"```python\n",
"config_list = [\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your OpenAI API key here>',\n",
" },\n",
" {\n",
" 'model': 'gpt-4',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2024-02-15-preview',\n",
" },\n",
" {\n",
" 'model': 'gpt-3.5-turbo',\n",
" 'api_key': '<your Azure OpenAI API key here>',\n",
" 'base_url': '<your Azure OpenAI API base here>',\n",
" 'api_type': 'azure',\n",
" 'api_version': '2024-02-15-preview',\n",
" },\n",
"]\n",
"```\n",
"\n",
"If you open this notebook in colab, you can upload your files by clicking the file icon on the left panel and then choose \"upload file\" icon.\n",
"\n",
"You can set the value of config_list in other ways you prefer, e.g., loading from a YAML file."
"````{=mdx}\n",
":::tip\n",
"Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).\n",
":::\n",
"````"
]
},
{
@@ -1058,6 +1030,10 @@
}
],
"metadata": {
"front_matter": {
"tags": ["rag"],
"description": "This notebook demonstrates the usage of QdrantRetrieveUserProxyAgent for RAG."
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
8 changes: 4 additions & 4 deletions website/docs/Examples.md
@@ -12,17 +12,17 @@ Links to notebook examples:

- Automated Task Solving with Code Generation, Execution & Debugging - [View Notebook](/docs/notebooks/agentchat_auto_feedback_from_code_execution)
- Automated Code Generation and Question Answering with Retrieval Augmented Agents - [View Notebook](/docs/notebooks/agentchat_RetrieveChat)
- Automated Code Generation and Question Answering with [Qdrant](https://qdrant.tech/) based Retrieval Augmented Agents - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_qdrant_RetrieveChat.ipynb)
- Automated Code Generation and Question Answering with [Qdrant](https://qdrant.tech/) based Retrieval Augmented Agents - [View Notebook](/docs/notebooks/agentchat_qdrant_RetrieveChat)

1. **Multi-Agent Collaboration (>3 Agents)**

- Automated Task Solving by Group Chat (with 3 group member agents and 1 manager agent) - [View Notebook](/docs/notebooks/agentchat_groupchat)
- Automated Data Visualization by Group Chat (with 3 group member agents and 1 manager agent) - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat_vis.ipynb)
- Automated Data Visualization by Group Chat (with 3 group member agents and 1 manager agent) - [View Notebook](/docs/notebooks/agentchat_groupchat_vis)
- Automated Complex Task Solving by Group Chat (with 6 group member agents and 1 manager agent) - [View Notebook](/docs/notebooks/agentchat_groupchat_research)
- Automated Task Solving with Coding & Planning Agents - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_planning.ipynb)
- Automated Task Solving with transition paths specified in a graph - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_graph_modelling_language_using_select_speaker.ipynb)
- Running a group chat as an inner-monologue via the SocietyOfMindAgent - [View Notebook](/docs/notebooks/agentchat_society_of_mind)
- Running a group chat with custom speaker selection function - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat_customized.ipynb)
- Running a group chat with custom speaker selection function - [View Notebook](/docs/notebooks/agentchat_groupchat_customized)

1. **Sequential Multi-Agent Chats**
- Solving Multiple Tasks in a Sequence of Chats Initiated by a Single Agent - [View Notebook](/docs/notebooks/agentchat_multi_task_chats)
@@ -31,7 +31,7 @@

1. **Nested Chats**
- Solving Complex Tasks with Nested Chats - [View Notebook](/docs/notebooks/agentchat_nestedchat)
- OptiGuide for Solving a Supply Chain Optimization Problem with Nested Chats with a Coding Agent and a Safeguard Agent - [View Notebook](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_nestedchat_optiguide.ipynb)
- OptiGuide for Solving a Supply Chain Optimization Problem with Nested Chats with a Coding Agent and a Safeguard Agent - [View Notebook](/docs/notebooks/agentchat_nestedchat_optiguide)

1. **Applications**
