
Commit cc304b3

Authored by ekzhu, jackgerrits, gagb, and joshkyh
AutoGen Tutorial (microsoft#1702)
* update intro
* update intro
* tutorial
* update notebook
* update notebooks
* update
* merge
* add conversation patterns
* rename; delete unused files.
* Reorganize new guides
* Improve intro, fix typos
* add what is next
* outline for code executor
* initiate chats png
* Improve language
* Improve language of human in the loop tutorial
* update
* update
* Update group chat
* code executor
* update convsersation patterns
* update code executor section to use legacy code executor
* update conversation pattern
* redirect
* update figures
* update whats next
* Break down chapter 2 into two chapters
* udpate
* fix website build
* Minor corrections of typos and grammar.
* remove broken links, update sidebar
* code executor update
* Suggest changes to the code executor section
* update what is next
* reorder
* update getting started
* title
* update navbar
* Delete website/docs/tutorial/what-is-next.ipynb
* update conversable patterns
* Improve language
* Fix typo
* minor fixes

---------

Co-authored-by: Jack Gerrits <[email protected]>
Co-authored-by: gagb <[email protected]>
Co-authored-by: Joshua Kim <[email protected]>
Co-authored-by: Jack Gerrits <[email protected]>
1 parent bea8d09 commit cc304b3

22 files changed: +3,822 −82 lines

.pre-commit-config.yaml

+1
```diff
@@ -41,6 +41,7 @@ repos:
             pyproject.toml |
             website/static/img/ag.svg |
             website/yarn.lock |
+            website/docs/tutorial/code-executors.ipynb |
             notebook/.*
             )$
   - repo: https://github.com/nbQA-dev/nbQA
```

website/.gitattributes

+11
```diff
@@ -0,0 +1,11 @@
+docs/Tutorial/code_executor_files/figure-markdown_strict/cell-8-output-1.png filter=lfs diff=lfs merge=lfs -text
+docs/Tutorial/assets/Human-in-the-loop.png filter=lfs diff=lfs merge=lfs -text
+docs/Tutorial/assets/conversable-agent.png filter=lfs diff=lfs merge=lfs -text
+docs/Tutorial/.cache/41/cache.db filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/nested-chats.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/sequential-two-agent-chat.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/two-agent-chat.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/code-execution-in-conversation.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/code-executor-docker.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/code-executor-no-docker.png filter=lfs diff=lfs merge=lfs -text
+docs/tutorial/assets/group-chat.png filter=lfs diff=lfs merge=lfs -text
```

website/.gitignore

+4 −1
```diff
@@ -11,8 +11,11 @@ package-lock.json
 docs/reference
 /docs/notebooks

+docs/tutorial/*.mdx
+docs/tutorial/**/*.png
+!docs/tutorial/assets/*.png
 docs/topics/llm_configuration.mdx
-docs/topics/code-execution/jupyter-code-executor.mdx
+docs/topics/code-execution/*.mdx

 # Misc
 .DS_Store
```

website/docs/Getting-Started.md

-78
This file was deleted.

website/docs/Getting-Started.mdx

+138
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

# Getting Started

AutoGen is a framework that enables development of LLM applications using multiple agents that can converse with each other to solve tasks. AutoGen agents are customizable, conversable, and seamlessly allow human participation. They can operate in various modes that employ combinations of LLMs, human inputs, and tools.

![AutoGen Overview](/img/autogen_agentchat.png)

### Main Features

- AutoGen enables building next-gen LLM applications based on [multi-agent conversations](/docs/Use-Cases/agent_chat) with minimal effort. It simplifies the orchestration, automation, and optimization of complex LLM workflows, maximizing the performance of LLMs and overcoming their weaknesses.
- It supports [diverse conversation patterns](/docs/Use-Cases/agent_chat#supporting-diverse-conversation-patterns) for complex workflows. With customizable and conversable agents, developers can use AutoGen to build a wide range of conversation patterns that vary in conversation autonomy, the number of agents, and agent conversation topology (see the group-chat sketch below).
- It provides a collection of working systems of varying complexity. These systems span a [wide range of applications](/docs/Use-Cases/agent_chat#diverse-applications-implemented-with-autogen) from various domains, demonstrating how AutoGen can easily support diverse conversation patterns.
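
As a minimal sketch of one such pattern, a group chat routes messages among several agents via a manager that picks the next speaker. The agent roles and the task below are illustrative, and the snippet assumes an OpenAI API key in the environment, as in the Quickstart:

```python
import os
from autogen import AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager

llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
}

# Two illustrative worker agents plus a user proxy that relays the task.
coder = AssistantAgent("coder", llm_config=llm_config)
critic = AssistantAgent("critic", llm_config=llm_config)
user_proxy = UserProxyAgent("user_proxy", human_input_mode="NEVER", code_execution_config=False)

# The group chat defines the conversation topology; the manager selects who speaks next.
groupchat = GroupChat(agents=[user_proxy, coder, critic], messages=[], max_round=6)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(manager, message="Write a function that reverses a string, then review it.")
```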

AutoGen is powered by collaborative [research studies](/docs/Research) from Microsoft, Penn State University, and University of Washington.

### Quickstart

```sh
pip install pyautogen
```

<Tabs>
  <TabItem value="local" label="Local execution" default>
:::warning
When asked, be sure to check the generated code before continuing to ensure it is safe to run.
:::

```python
from autogen import AssistantAgent, UserProxyAgent
from autogen.coding import LocalCommandLineCodeExecutor

import os
from pathlib import Path

llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
}

work_dir = Path("coding")
work_dir.mkdir(exist_ok=True)

assistant = AssistantAgent("assistant", llm_config=llm_config)

code_executor = LocalCommandLineCodeExecutor(work_dir=work_dir)
user_proxy = UserProxyAgent(
    "user_proxy", code_execution_config={"executor": code_executor}
)

# Start the chat
user_proxy.initiate_chat(
    assistant,
    message="Plot a chart of NVDA and TESLA stock price change YTD.",
)
```

  </TabItem>
  <TabItem value="docker" label="Docker execution">

```python
from autogen import AssistantAgent, UserProxyAgent
from autogen.coding import DockerCommandLineCodeExecutor

import os
from pathlib import Path

llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
}

work_dir = Path("coding")
work_dir.mkdir(exist_ok=True)

with DockerCommandLineCodeExecutor(work_dir=work_dir) as code_executor:
    assistant = AssistantAgent("assistant", llm_config=llm_config)
    user_proxy = UserProxyAgent(
        "user_proxy", code_execution_config={"executor": code_executor}
    )

    # Start the chat
    user_proxy.initiate_chat(
        assistant,
        message="Plot a chart of NVDA and TESLA stock price change YTD. Save the plot to a file called plot.png",
    )
```

Open `coding/plot.png` to see the generated plot.

  </TabItem>
</Tabs>

:::tip
Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).
:::
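
As one illustrative alternative to hard-coding the API key, the `llm_config` used above can be assembled from a JSON config file. The `OAI_CONFIG_LIST` file name and the gpt-4 filter below are assumptions for the sketch, not requirements:

```python
import autogen

# Load endpoint entries from an OAI_CONFIG_LIST env variable or file (assumed to exist).
config_list = autogen.config_list_from_json(
    env_or_file="OAI_CONFIG_LIST",
    filter_dict={"model": ["gpt-4"]},  # keep only the entries for the models you want
)
llm_config = {"config_list": config_list, "cache_seed": 42}  # cache_seed enables response caching
```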

#### Multi-Agent Conversation Framework

AutoGen enables next-gen LLM applications with a generic multi-agent conversation framework. It offers customizable and conversable agents that integrate LLMs, tools, and humans. By automating chat among multiple capable agents, one can easily make them collectively perform tasks autonomously or with human feedback, including tasks that require using tools via code. A minimal sketch along the lines of this [example](https://github.com/microsoft/autogen/blob/main/test/twoagent.py) is shown below.
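
The snippet assumes an `OAI_CONFIG_LIST` JSON file describing your model endpoints (see the LLM configuration tip above); it is a sketch of the linked two-agent example rather than a verbatim copy:

```python
from autogen import AssistantAgent, UserProxyAgent, config_list_from_json

# Load LLM inference endpoints from an env variable or a JSON file named OAI_CONFIG_LIST.
config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")

# The assistant plans and writes code; the user proxy executes it in the ./coding directory.
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent("user_proxy", code_execution_config={"work_dir": "coding"})

user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")
```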

The figure below shows an example conversation flow with AutoGen.

![Agent Chat Example](/img/chat_example.png)

### Where to Go Next?

* Go through the [tutorial](/docs/tutorial/introduction) to learn more about the core concepts in AutoGen
* Read the examples and guides in the [notebooks section](/docs/notebooks)
* Understand the use cases for [multi-agent conversation](/docs/Use-Cases/agent_chat) and [enhanced LLM inference](/docs/Use-Cases/enhanced_inference)
* Read the [API](/docs/reference/agentchat/conversable_agent/) docs
* Learn about [research](/docs/Research) around AutoGen
* Chat on [Discord](https://discord.gg/pAbnFJrkgZ)
* Follow on [Twitter](https://twitter.com/pyautogen)

If you like our project, please give it a [star](https://github.com/microsoft/autogen/stargazers) on GitHub. If you are interested in contributing, please read the [Contributor's Guide](/docs/Contribute).

<iframe src="https://ghbtns.com/github-btn.html?user=microsoft&amp;repo=autogen&amp;type=star&amp;count=true&amp;size=large" frameborder="0" scrolling="0" width="170" height="30" title="GitHub"></iframe>
