Split apart ecosystem page, update sidebar, other website tweaks #1812

Merged · 4 commits · Mar 2, 2024
29 changes: 0 additions & 29 deletions website/docs/Ecosystem.md

This file was deleted.

22 changes: 3 additions & 19 deletions website/docs/FAQ.md → website/docs/FAQ.mdx
@@ -1,24 +1,8 @@
+import TOCInline from '@theme/TOCInline';
+
 # Frequently Asked Questions

-- [Install the correct package - `pyautogen`](#install-the-correct-package---pyautogen)
-- [Set your API endpoints](#set-your-api-endpoints)
-- [Use the constructed configuration list in agents](#use-the-constructed-configuration-list-in-agents)
-- [Unexpected keyword argument 'base_url'](#unexpected-keyword-argument-base_url)
-- [How does an agent decide which model to pick out of the list?](#how-does-an-agent-decide-which-model-to-pick-out-of-the-list)
-- [Can I use non-OpenAI models?](#can-i-use-non-openai-models)
-- [Handle Rate Limit Error and Timeout Error](#handle-rate-limit-error-and-timeout-error)
-- [How to continue a finished conversation](#how-to-continue-a-finished-conversation)
-- [How do we decide what LLM is used for each agent? How many agents can be used? How do we decide how many agents in the group?](#how-do-we-decide-what-llm-is-used-for-each-agent-how-many-agents-can-be-used-how-do-we-decide-how-many-agents-in-the-group)
-- [Why is code not saved as file?](#why-is-code-not-saved-as-file)
-- [Code execution](#code-execution)
-- [Enable Python 3 docker image](#enable-python-3-docker-image)
-- [Agents keep thanking each other when using `gpt-3.5-turbo`](#agents-keep-thanking-each-other-when-using-gpt-35-turbo)
-- [ChromaDB fails in codespaces because of old version of sqlite3](#chromadb-fails-in-codespaces-because-of-old-version-of-sqlite3)
-- [How to register a reply function](#how-to-register-a-reply-function)
-- [How to get last message?](#how-to-get-last-message)
-- [How to get each agent message?](#how-to-get-each-agent-message)
-- [When using autogen docker, is it always necessary to reinstall modules?](#when-using-autogen-docker-is-it-always-necessary-to-reinstall-modules)
-- [Agents are throwing due to docker not running, how can I resolve this?](#agents-are-throwing-due-to-docker-not-running-how-can-i-resolve-this)
+<TOCInline toc={toc} />

## Install the correct package - `pyautogen`

7 changes: 7 additions & 0 deletions website/docs/ecosystem/memgpt.md
@@ -0,0 +1,7 @@
# MemGPT

![MemGPT Example](img/ecosystem-memgpt.png)

MemGPT enables LLMs to manage their own memory and overcome limited context windows. You can use MemGPT to create perpetual chatbots that learn about you and modify their own personalities over time. You can connect MemGPT to your own local filesystems and databases, as well as connect MemGPT to your own tools and APIs. The MemGPT + AutoGen integration allows you to equip any AutoGen agent with MemGPT capabilities.

- [MemGPT + AutoGen Documentation with Code Examples](https://memgpt.readme.io/docs/autogen)
7 changes: 7 additions & 0 deletions website/docs/ecosystem/microsoft-fabric.md
@@ -0,0 +1,7 @@
# Microsoft Fabric

![Fabric Example](img/ecosystem-fabric.png)

[Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/get-started/microsoft-fabric-overview) is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. In this notebook, we give a simple example of using AutoGen in Microsoft Fabric.

- [Microsoft Fabric + AutoGen Code Examples](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_microsoft_fabric.ipynb)
7 changes: 7 additions & 0 deletions website/docs/ecosystem/ollama.md
@@ -0,0 +1,7 @@
# Ollama

![Ollama Example](img/ecosystem-ollama.png)

[Ollama](https://ollama.com/) allows users to run open-source large language models, such as Llama 2, locally. Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile. It optimizes setup and configuration details, including GPU usage.

- [Ollama + AutoGen instruction](https://ollama.ai/blog/openai-compatibility)
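Because Ollama exposes an OpenAI-compatible endpoint (by default at `http://localhost:11434/v1`), an AutoGen `config_list` entry can simply point at it. A minimal sketch, assuming a local Ollama server with a `llama2` model pulled (the `api_key` value is a placeholder; Ollama ignores it):

```python
# Sketch of an OpenAI-compatible config entry for a local Ollama server.
# Assumes Ollama is running on its default port with "llama2" pulled.
config_list = [
    {
        "model": "llama2",                        # any model pulled via `ollama pull`
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # placeholder; Ollama does not check it
    }
]

# This list would then be passed to an agent, e.g.:
# assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
print(config_list[0]["base_url"])
```

Swapping `model` for any other locally pulled model is the only change needed to try a different open-source LLM.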
2 changes: 1 addition & 1 deletion website/docs/installation/Installation.mdx
@@ -81,7 +81,7 @@ pip install pyautogen

## Code execution with Docker (default)

-Even if you install AutoGen locally, we highly recommend using Docker for [code execution](FAQ.md#code-execution).
+Even if you install AutoGen locally, we highly recommend using Docker for [code execution](FAQ.mdx#code-execution).

The default behaviour for code-execution agents is for code execution to be performed in a docker container.
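A sketch of the settings a code-execution agent would receive to make that default explicit (the `work_dir` name here is illustrative; `use_docker` is the flag that controls containerized execution):

```python
# Code-execution settings for an AutoGen user-proxy-style agent.
# "use_docker": True runs generated code inside a Docker container
# (requires a running Docker daemon); set it to False only if you
# accept executing generated code directly on the host.
code_execution_config = {
    "work_dir": "coding",  # directory where generated code files are written
    "use_docker": True,    # execute inside a container
}

# Typically passed at agent construction, e.g.:
# user_proxy = autogen.UserProxyAgent("user_proxy",
#                                     code_execution_config=code_execution_config)
```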

25 changes: 12 additions & 13 deletions website/docusaurus.config.js
@@ -50,7 +50,7 @@ module.exports = {
type: "doc",
docId: "reference/agentchat/conversable_agent",
position: "left",
-label: "SDK",
+label: "API",
},
{ to: "blog", label: "Blog", position: "left" },
{
@@ -76,18 +76,9 @@
// label: "Notebooks",
// },
{
-label: "Resources",
-type: "dropdown",
-items: [
-{
-type: "doc",
-docId: "Ecosystem",
-},
-{
-type: "doc",
-docId: "Gallery",
-},
-],
+type: "doc",
+position: "left",
+docId: "Gallery",
},
{
label: "Other Languages",
@@ -195,6 +186,14 @@
to: "/docs/llm_configuration/",
from: ["/docs/llm_endpoint_configuration/"],
},
+{
+to: "/docs/ecosystem/memgpt/",
+from: ["/docs/Ecosystem"],
+},
+{
+to: "/docs/Getting-Started",
+from: ["/docs/"],
+},
],
},
]
1 change: 1 addition & 0 deletions website/sidebars.js
@@ -26,6 +26,7 @@
{'Use Cases': [{type: 'autogenerated', dirName: 'Use-Cases'}]},
'Contribute',
'Research',
+{'Ecosystem': [{type: 'autogenerated', dirName: 'ecosystem'}]},
'Migration-Guide'
],
// pydoc-markdown auto-generated markdowns from docstrings