diff --git a/website/docs/Ecosystem.md b/website/docs/Ecosystem.md
index 63cdd93fcd7f..88e3236ddf50 100644
--- a/website/docs/Ecosystem.md
+++ b/website/docs/Ecosystem.md
@@ -19,3 +19,11 @@ MemGPT enables LLMs to manage their own memory and overcome limited context wind
 [Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/get-started/microsoft-fabric-overview) is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. In this notebook, we give a simple example of using AutoGen in Microsoft Fabric.
 
 - [Microsoft Fabric + AutoGen Code Examples](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_microsoft_fabric.ipynb)
+
+## Ollama + AutoGen
+
+![Ollama Example](img/ecosystem-ollama.png)
+
+[Ollama](https://ollama.com/) allows users to run open-source large language models, such as Llama 2, locally. Ollama bundles model weights, configuration, and data into a single package defined by a Modelfile. It optimizes setup and configuration details, including GPU usage.
+
+- [Ollama + AutoGen instructions](https://ollama.ai/blog/openai-compatibility)
diff --git a/website/docs/img/ecosystem-ollama.png b/website/docs/img/ecosystem-ollama.png
new file mode 100644
index 000000000000..1cd707b7c98b
Binary files /dev/null and b/website/docs/img/ecosystem-ollama.png differ
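
The linked blog post describes connecting AutoGen to Ollama through Ollama's OpenAI-compatible endpoint. As a minimal sketch of what such a configuration might look like (the model name `llama2`, the port `11434`, and the placeholder API key are assumptions; an actual agent additionally requires a running Ollama server):

```python
# A config_list entry pointing an AutoGen agent at a local Ollama server.
# Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1 by
# default; the model name must match one pulled with `ollama pull`.
config_list = [
    {
        "model": "llama2",                        # assumed model name
        "base_url": "http://localhost:11434/v1",  # Ollama's default endpoint
        "api_key": "ollama",                      # placeholder; not validated locally
    }
]

# With a running Ollama server, an agent could then be created roughly as:
# from autogen import AssistantAgent
# assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
print(config_list[0]["base_url"])
```

Because Ollama speaks the OpenAI wire protocol, no Ollama-specific client code is needed; only the `base_url` and `model` fields differ from a hosted OpenAI configuration.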