simplify getting-started; update news #2175

Merged · 3 commits · Mar 28, 2024
README.md (7 additions, 5 deletions)
@@ -12,23 +12,25 @@
<img src="https://github.com/microsoft/autogen/blob/main/website/static/img/flaml.svg" width=200>
<br>
</p> -->
+:fire: Mar 26: Andrew Ng gave a shoutout to AutoGen in [What's next for AI agentic workflows](https://youtu.be/sal78ACtGTc?si=JduUzN_1kDnMq0vF) at Sequoia Capital's AI Ascent.

:fire: Mar 3: What's new in AutoGen? 📰[Blog](https://microsoft.github.io/autogen/blog/2024/03/03/AutoGen-Update); 📺[Youtube](https://www.youtube.com/watch?v=j_mtwQiaLGU).

:fire: Mar 1: the first AutoGen multi-agent experiment on the challenging [GAIA](https://huggingface.co/spaces/gaia-benchmark/leaderboard) benchmark achieved No. 1 accuracy in all three levels.

-:fire: Jan 30: AutoGen is highlighted by Peter Lee in Microsoft Research Forum [Keynote](https://t.co/nUBSjPDjqD).
+:tada: Jan 30: AutoGen is highlighted by Peter Lee in Microsoft Research Forum [Keynote](https://t.co/nUBSjPDjqD).

-:fire: Dec 31: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155) is selected by [TheSequence: My Five Favorite AI Papers of 2023](https://thesequence.substack.com/p/my-five-favorite-ai-papers-of-2023).
+:tada: Dec 31: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155) is selected by [TheSequence: My Five Favorite AI Papers of 2023](https://thesequence.substack.com/p/my-five-favorite-ai-papers-of-2023).

<!-- :fire: Nov 24: pyautogen [v0.2](https://github.com/microsoft/autogen/releases/tag/v0.2.0) is released with many updates and new features compared to v0.1.1. It switches to using openai-python v1. Please read the [migration guide](https://microsoft.github.io/autogen/docs/Installation#python). -->

<!-- :fire: Nov 11: OpenAI's Assistants are available in AutoGen and interoperatable with other AutoGen agents! Checkout our [blogpost](https://microsoft.github.io/autogen/blog/2023/11/13/OAI-assistants) for details and examples. -->

-:fire: Nov 8: AutoGen is selected into [Open100: Top 100 Open Source achievements](https://www.benchcouncil.org/evaluation/opencs/annual.html) 35 days after spinoff.
+:tada: Nov 8: AutoGen is selected into [Open100: Top 100 Open Source achievements](https://www.benchcouncil.org/evaluation/opencs/annual.html) 35 days after spinoff.

-:fire: Nov 6: AutoGen is mentioned by Satya Nadella in a [fireside chat](https://youtu.be/0pLBvgYtv6U).
+:tada: Nov 6: AutoGen is mentioned by Satya Nadella in a [fireside chat](https://youtu.be/0pLBvgYtv6U).

-:fire: Nov 1: AutoGen is the top trending repo on GitHub in October 2023.
+:tada: Nov 1: AutoGen is the top trending repo on GitHub in October 2023.

:tada: Oct 03: AutoGen spins off from FLAML on GitHub and has a major paper update (first version on Aug 16).

autogen/coding/docker_commandline_code_executor.py (1 addition, 2 deletions)
@@ -83,8 +83,7 @@ def __init__(
        if isinstance(work_dir, str):
            work_dir = Path(work_dir)

-        if not work_dir.exists():
-            raise ValueError(f"Working directory {work_dir} does not exist.")
+        work_dir.mkdir(exist_ok=True)

        client = docker.from_env()

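With this patch the Docker executor creates a missing working directory instead of rejecting it, which is what lets the getting-started examples below drop their explicit `mkdir` calls. A minimal sketch of the new behavior, assuming a running Docker daemon and the `CodeBlock`/executor exports of `autogen.coding` in the 0.2.x line:

```python
from autogen.coding import CodeBlock, DockerCommandLineCodeExecutor

# "scratch" need not exist beforehand: __init__ now calls
# work_dir.mkdir(exist_ok=True) instead of raising ValueError.
with DockerCommandLineCodeExecutor(work_dir="scratch") as executor:
    result = executor.execute_code_blocks(
        [CodeBlock(language="python", code="print('hello from the container')")]
    )
    print(result.exit_code, result.output)
```
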
autogen/coding/local_commandline_code_executor.py (1 addition, 2 deletions)
@@ -71,8 +71,7 @@ def __init__(
        if isinstance(work_dir, str):
            work_dir = Path(work_dir)

-        if not work_dir.exists():
-            raise ValueError(f"Working directory {work_dir} does not exist.")
+        work_dir.mkdir(exist_ok=True)

        self._timeout = timeout
        self._work_dir: Path = work_dir
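The local executor gets the same treatment: a `str` work_dir is first coerced to `Path`, then created if absent. A short caller-side sketch under the same assumptions, no Docker required:

```python
from autogen.coding import CodeBlock, LocalCommandLineCodeExecutor

# The "coding" directory is created at construction time if it is missing.
executor = LocalCommandLineCodeExecutor(timeout=60, work_dir="coding")
result = executor.execute_code_blocks(
    [CodeBlock(language="python", code="print(6 * 7)")]
)
print(result.exit_code)  # 0 on success; result.output should contain "42"
```
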
autogen/version.py (1 addition, 1 deletion)
@@ -1 +1 @@
-__version__ = "0.2.20"
+__version__ = "0.2.21"
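Since the relaxed work_dir handling ships with this bump, a quick runtime check is possible (assuming the package re-exports `__version__` at the top level, as pyautogen 0.2.x does):

```python
import autogen

# 0.2.21 and later create a missing work_dir; 0.2.20 raised ValueError instead.
print(autogen.__version__)
```
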
test/coding/test_commandline_code_executor.py (4 additions, 2 deletions)
@@ -1,5 +1,6 @@
from pathlib import Path
import sys
import os
import tempfile
import uuid
import pytest
@@ -10,7 +11,8 @@
from autogen.coding.docker_commandline_code_executor import DockerCommandLineCodeExecutor
from autogen.coding.local_commandline_code_executor import LocalCommandLineCodeExecutor

-from conftest import MOCK_OPEN_AI_API_KEY, skip_docker
+sys.path.append(os.path.join(os.path.dirname(__file__), ".."))
+from conftest import MOCK_OPEN_AI_API_KEY, skip_docker  # noqa: E402

if skip_docker or not is_docker_running():
    classes_to_test = [LocalCommandLineCodeExecutor]
@@ -52,7 +54,7 @@ def test_commandline_executor_init(cls) -> None:
    assert executor.timeout == 10 and str(executor.work_dir) == "."

    # Try invalid working directory.
-    with pytest.raises(ValueError, match="Working directory .* does not exist."):
+    with pytest.raises(FileNotFoundError):
        executor = cls(timeout=111, work_dir="/invalid/directory")


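The expected exception changes because `Path.mkdir(exist_ok=True)` is called without `parents=True`: when the parent of the working directory does not exist, `mkdir` itself raises `FileNotFoundError`, and the constructor now lets that propagate. A pathlib-only illustration of the distinction:

```python
from pathlib import Path

Path("coding").mkdir(exist_ok=True)  # fine: the parent (cwd) exists
try:
    Path("/invalid/directory").mkdir(exist_ok=True)
except FileNotFoundError as err:
    print(err)  # parent "/invalid" does not exist, so mkdir fails
```
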
website/docs/Getting-Started.mdx (24 additions, 25 deletions)
@@ -38,30 +38,37 @@ pip install pyautogen
```

<Tabs>
<TabItem value="nocode" label="No code execution" default>

```python
from autogen import AssistantAgent, UserProxyAgent

llm_config = {"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}
assistant = AssistantAgent("assistant", llm_config=llm_config)
user_proxy = UserProxyAgent("user_proxy", code_execution_config=False)

# Start the chat
user_proxy.initiate_chat(
assistant,
message="Tell me a joke about NVDA and TESLA stock prices.",
)
```

</TabItem>
<TabItem value="local" label="Local execution" default>
:::warning
When asked, be sure to check the generated code before continuing to ensure it is safe to run.
:::

```python
+import autogen
from autogen import AssistantAgent, UserProxyAgent
-from autogen.coding import LocalCommandLineCodeExecutor

import os
-from pathlib import Path

-llm_config = {
-    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
-}

-work_dir = Path("coding")
-work_dir.mkdir(exist_ok=True)

+llm_config = {"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}
assistant = AssistantAgent("assistant", llm_config=llm_config)

-code_executor = LocalCommandLineCodeExecutor(work_dir=work_dir)
user_proxy = UserProxyAgent(
-    "user_proxy", code_execution_config={"executor": code_executor}
+    "user_proxy", code_execution_config={"executor": autogen.coding.LocalCommandLineCodeExecutor(work_dir="coding")}
)

# Start the chat
Expand All @@ -75,20 +82,12 @@ user_proxy.initiate_chat(
<TabItem value="docker" label="Docker execution" default>

```python
+import autogen
from autogen import AssistantAgent, UserProxyAgent
-from autogen.coding import DockerCommandLineCodeExecutor

import os
-from pathlib import Path

-llm_config = {
-    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
-}
+llm_config = {"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}

-work_dir = Path("coding")
-work_dir.mkdir(exist_ok=True)

-with DockerCommandLineCodeExecutor(work_dir=work_dir) as code_executor:
+with autogen.coding.DockerCommandLineCodeExecutor(work_dir="coding") as code_executor:
    assistant = AssistantAgent("assistant", llm_config=llm_config)
    user_proxy = UserProxyAgent(
        "user_proxy", code_execution_config={"executor": code_executor}
@@ -103,7 +102,7 @@

Open `coding/plot.png` to see the generated plot.

</TabItem>
</Tabs>
