From 5f4ea5fee2cdd41679697a3a543ea97dc3176e5f Mon Sep 17 00:00:00 2001
From: Chi Wang
Date: Sun, 1 Oct 2023 15:25:08 +0000
Subject: [PATCH 1/3] expand faq

---
 website/docs/FAQ.md | 21 +++++++++++++++++++++
 1 file changed, 21 insertions(+)

diff --git a/website/docs/FAQ.md b/website/docs/FAQ.md
index 3cd0ae048feb..56ef8bc15a78 100644
--- a/website/docs/FAQ.md
+++ b/website/docs/FAQ.md
@@ -109,3 +109,24 @@ You can set `retry_wait_time` and `max_retry_period` to handle rate limit error.
 - `request_timeout` (int): the timeout (in seconds) sent with a single request.
 
 Please refer to the [documentation](/docs/Use-Cases/enhanced_inference#runtime-error) for more info.
+
+## How to continue a finished conversation
+
+When you call `initiate_chat`, the conversation restarts by default. Use `send` or `initiate_chat(clear_history=False)` to continue a previous conversation.
+
+## How do we decide which LLM to use for each agent? How many agents can be used? How do we decide how many agents to include in a group?
+
+Each agent can be customized. You can use an LLM, tools, or a human behind each agent. If you use an LLM for an agent, use the one best suited for its role. There is no limit on the number of agents, but start with a small number such as 2 or 3. The more capable the LLM is and the fewer roles you need, the fewer agents you need.
+
+The default user proxy agent doesn't use an LLM. If you'd like to use an LLM in `UserProxyAgent`, one use case is to simulate a user's behavior.
+
+The default assistant agent is instructed to use both coding and language skills. It doesn't have to do coding, depending on the task, and you can customize its system message. If you want to use it for coding, use a model that's good at coding.
+
+## Why is code not saved as a file?
+
+If you are using a custom system message for the coding agent, please include something like
+`If you want the user to save the code in a file before executing it, put # filename: <filename> inside the code block as the first line.`
+in the system message. This line is part of the default system message of the `AssistantAgent`.
+
+If `# filename` still doesn't appear in the suggested code, consider adding explicit instructions such as "save the code to disk" to the initial user message in `initiate_chat`.
+The `AssistantAgent` doesn't save all code by default, because there are cases in which one would just like to finish a task without saving the code.

From 2f1e6c46a0f0debd372742e02540bc00806fc09f Mon Sep 17 00:00:00 2001
From: Chi Wang
Date: Sun, 1 Oct 2023 21:01:53 +0000
Subject: [PATCH 2/3] models

---
 website/docs/FAQ.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/website/docs/FAQ.md b/website/docs/FAQ.md
index 56ef8bc15a78..8cac002b3a12 100644
--- a/website/docs/FAQ.md
+++ b/website/docs/FAQ.md
@@ -100,6 +100,10 @@ You can also explicitly specify that by:
 assistant = autogen.AssistantAgent(name="assistant", llm_config={"api_key": ...})
 ```
 
+### Can I use non-OpenAI models?
+
+Yes. Please check https://microsoft.github.io/autogen/blog/2023/07/14/Local-LLMs for an example.
+
 ## Handle Rate Limit Error and Timeout Error
 
 You can set `retry_wait_time` and `max_retry_period` to handle rate limit error. And you can set `request_timeout` to handle timeout error. They can all be specified in `llm_config` for an agent, which will be used in the [`create`](/docs/reference/oai/completion#create) function for LLM inference.
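The `clear_history=False` / `send` behavior described in the new "How to continue a finished conversation" entry can be illustrated with a minimal sketch. It assumes `pyautogen` is installed and that an `OAI_CONFIG_LIST` file or environment variable with valid credentials is available; the `work_dir` value and the task messages are illustrative assumptions rather than content from this patch:

```python
# Minimal sketch: continuing a finished conversation instead of restarting it.
# Assumes pyautogen is installed and OAI_CONFIG_LIST holds valid credentials.
import autogen

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding", "use_docker": False},  # work_dir is an arbitrary choice
)

# First conversation (initiate_chat clears the history by default).
user_proxy.initiate_chat(assistant, message="Plot a sine wave and save it to sine.png.")

# Option 1: continue the same conversation by keeping the accumulated history.
user_proxy.initiate_chat(
    assistant,
    message="Change the curve color to red.",
    clear_history=False,
)

# Option 2: send a follow-up message directly to the same recipient.
user_proxy.send("Also add axis labels.", assistant)
```

Both follow-up calls reuse the message history from the first conversation rather than starting from scratch.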
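For the entry on choosing an LLM per agent and the number of agents in a group, a small group chat might look like the sketch below. The model names, placeholder keys, and `GroupChat` settings are assumptions for illustration; `GroupChat` and `GroupChatManager` come from the same `autogen` package but are not mentioned in the FAQ text above:

```python
# A sketch of a small group of agents, each backed by the LLM best suited to its role.
import autogen

config_list_gpt4 = [{"model": "gpt-4", "api_key": "sk-..."}]           # placeholder key
config_list_gpt35 = [{"model": "gpt-3.5-turbo", "api_key": "sk-..."}]  # placeholder key

coder = autogen.AssistantAgent(
    name="coder",
    llm_config={"config_list": config_list_gpt4},   # a stronger model for coding
)
writer = autogen.AssistantAgent(
    name="writer",
    system_message="You polish the final answer for the user.",
    llm_config={"config_list": config_list_gpt35},  # a cheaper model is enough for this role
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "groupchat", "use_docker": False},
)

groupchat = autogen.GroupChat(agents=[user_proxy, coder, writer], messages=[], max_round=12)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config={"config_list": config_list_gpt4})
user_proxy.initiate_chat(manager, message="Count the words in README.md with a script, then summarize the result.")
```

Starting with two or three agents like this keeps the conversation easy to follow; add more roles only when a single agent cannot cover them well.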
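For the "Why is code not saved as a file?" entry, one way to customize the assistant's system message without losing the `# filename:` instruction is sketched below. The sentence appended to the prompt is an assumed example, not text from this patch:

```python
# Sketch: extend the default system message instead of replacing it, so the
# "# filename: <filename>" instruction that enables code saving is preserved.
import autogen

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")  # illustrative source of credentials

custom_system_message = (
    autogen.AssistantAgent.DEFAULT_SYSTEM_MESSAGE
    # The appended sentence below is an assumed example of an extra instruction.
    + "\nWhen the code is non-trivial, ask the user to save it to a file before executing it."
)

assistant = autogen.AssistantAgent(
    name="assistant",
    system_message=custom_system_message,
    llm_config={"config_list": config_list},
)
```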
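The "Can I use non-OpenAI models?" entry and the rate limit / timeout parameters can be combined into one `llm_config` sketch. The local endpoint URL, model names, placeholder keys, and the numeric values are assumptions; the `api_base`/`api_type` style of entry follows the Local-LLMs blog post linked above, and the retry/timeout keys are the ones listed in the FAQ:

```python
# A sketch of llm_config covering retry/timeout settings and a non-OpenAI model
# served behind an OpenAI-compatible endpoint. URLs, model names, keys, and the
# numeric values are illustrative assumptions.
import autogen

config_list = [
    {"model": "gpt-4", "api_key": "sk-..."},  # placeholder key for a hosted model
    {
        # A locally served model (see the Local-LLMs blog post for a full walkthrough).
        "model": "chatglm2-6b",
        "api_base": "http://localhost:8000/v1",
        "api_type": "open_ai",
        "api_key": "NULL",
    },
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={
        "config_list": config_list,
        "request_timeout": 120,   # seconds allowed for a single request
        "retry_wait_time": 10,    # seconds to wait after a rate limit error
        "max_retry_period": 600,  # stop retrying after this many seconds in total
    },
)
```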
From be5d21d61766d754eddd6b9d91f11a08ac8c04dd Mon Sep 17 00:00:00 2001
From: Chi Wang
Date: Mon, 2 Oct 2023 01:20:41 +0000
Subject: [PATCH 3/3] fix format error

---
 README.md                    | 10 ++++------
 website/docusaurus.config.js |  4 ++--
 website/static/img/ag.ico    | Bin 0 -> 3126 bytes
 website/static/img/ag.svg    |  1 +
 4 files changed, 7 insertions(+), 8 deletions(-)
 create mode 100644 website/static/img/ag.ico
 create mode 100644 website/static/img/ag.svg

diff --git a/README.md b/README.md
index 77c51f96585a..c552f5f1c4f0 100644
--- a/README.md
+++ b/README.md
@@ -150,9 +150,9 @@ Microsoft and any contributors reserve all other rights, whether under their res
 or trademarks, whether by implication, estoppel or otherwise.
 
-## Citation
-[AutoGen](https://arxiv.org/abs/2308.08155).
-```
+## Citation
+[AutoGen](https://arxiv.org/abs/2308.08155).
+```
 @inproceedings{wu2023autogen,
     title={AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework},
     author={Qingyun Wu and Gagan Bansal and Jieyu Zhang and Yiran Wu and Shaokun Zhang and Erkang Zhu and Beibin Li and Li Jiang and Xiaoyun Zhang and Chi Wang},
     year={2023},
     eprint={2308.08155},
     archivePrefix={arXiv},
     primaryClass={cs.AI}
 }
 ```
@@ -173,7 +173,7 @@ or trademarks, whether by implication, estoppel or otherwise.
 }
 ```
 
- [MathChat](https://arxiv.org/abs/2306.01337).
+[MathChat](https://arxiv.org/abs/2306.01337).
 
 ```
 @inproceedings{wu2023empirical,
     title={An Empirical Study on Challenging Math Problem Solving with GPT-4},
     author={Yiran Wu and Feiran Jia and Shaokun Zhang and Hangyu Li and Erkang Zhu and Yue Wang and Yin Tat Lee and Richard Peng and Qingyun Wu and Chi Wang},
     year={2023},
     booktitle={ArXiv preprint arXiv:2306.01337},
 }
 ```
-
-
diff --git a/website/docusaurus.config.js b/website/docusaurus.config.js
index c5f58e5e10b5..8ac69648e2de 100644
--- a/website/docusaurus.config.js
+++ b/website/docusaurus.config.js
@@ -9,7 +9,7 @@ module.exports = {
   baseUrl: '/autogen/',
   onBrokenLinks: 'throw',
   onBrokenMarkdownLinks: 'warn',
-  favicon: 'img/flaml_logo.ico',
+  favicon: 'img/ag.ico',
   organizationName: 'Microsoft', // Usually your GitHub org/user name.
   projectName: 'AutoGen', // Usually your repo name.
   themeConfig: {
@@ -17,7 +17,7 @@ module.exports = {
       title: 'AutoGen',
       logo: {
         alt: 'AutoGen',
-        src: 'img/flaml_logo_fill.svg',
+        src: 'img/ag.svg',
       },
       items: [
         {
diff --git a/website/static/img/ag.ico b/website/static/img/ag.ico
new file mode 100644
index 0000000000000000000000000000000000000000..f1789673b09252f61aedc8932f2dfecb8cd68e8d
Binary files /dev/null and b/website/static/img/ag.ico differ
diff --git a/website/static/img/ag.svg b/website/static/img/ag.svg
new file mode 100644
index 000000000000..9402bbdcab74
--- /dev/null
+++ b/website/static/img/ag.svg
@@ -0,0 +1 @@
+