
Commit 8023370

Move LLM Caching docs to topics (#1950)

* Move LLM Caching docs to topics
* Update llm-caching.md

1 parent 4a85b63 commit 8023370

File tree

2 files changed: +49 −50 lines

website/docs/Use-Cases/agent_chat.md

-50 lines

@@ -327,56 +327,6 @@ With the pluggable auto-reply function, one can choose to invoke conversations w

Another approach involves LLM-based function calls, where the LLM decides whether a specific function should be invoked based on the conversation's status during each inference. This approach enables dynamic multi-agent conversations, as seen in scenarios like the [multi-user math problem solving scenario](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_two_users.ipynb), where a student assistant automatically seeks expertise via function calls.

Removed:

### LLM Caching

Since version 0.2.8, a configurable context manager allows you to easily configure the LLM cache, using either DiskCache or Redis. All agents inside the context manager will use the same cache.

```python
from autogen import Cache

# Use Redis as cache
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)

# Use DiskCache as cache
with Cache.disk() as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

You can vary the `cache_seed` parameter to get different LLM output while still using the cache.

```python
# Setting cache_seed to 1 uses a different cache from the default one,
# so you will see different output.
with Cache.disk(cache_seed=1) as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

By default, DiskCache uses `.cache` for storage. To change the cache directory, set `cache_path_root`:

```python
with Cache.disk(cache_path_root="/tmp/autogen_cache") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```

For backward compatibility, DiskCache is on by default with `cache_seed` set to 41. To disable caching completely, set `cache_seed` to `None` in the `llm_config` of the agent.

```python
assistant = AssistantAgent(
    "coding_agent",
    llm_config={
        "cache_seed": None,
        "config_list": OAI_CONFIG_LIST,
        "max_tokens": 1024,
    },
)
```

### Diverse Applications Implemented with AutoGen

The figure below shows six examples of applications built using AutoGen.
website/docs/topics/llm-caching.md

+49 lines

@@ -0,0 +1,49 @@

# LLM Caching

Since version [`0.2.8`](https://github.com/microsoft/autogen/releases/tag/v0.2.8), a configurable context manager allows you to easily configure the LLM cache, using either [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) or [`RedisCache`](/docs/reference/cache/redis_cache#rediscache). All agents inside the context manager will use the same cache.

```python
from autogen import Cache

# Use Redis as cache
with Cache.redis(redis_url="redis://localhost:6379/0") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)

# Use DiskCache as cache
with Cache.disk() as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
You can vary the `cache_seed` parameter to get different LLM output while still using the cache.

```python
# Setting cache_seed to 1 uses a different cache from the default one,
# so you will see different output.
with Cache.disk(cache_seed=1) as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
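The effect of `cache_seed` can be illustrated with a toy in-memory cache that keys entries by seed and request. This is a minimal sketch of the idea, not AutoGen's implementation; `ToyCache`, `get_or_call`, and `fake_llm` are invented names for the example:

```python
# Toy sketch (NOT AutoGen's implementation): a context manager that
# memoizes "LLM" responses, with the seed partitioning the cache.
class ToyCache:
    def __init__(self, seed=41):
        self.seed = seed
        self.store = {}

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False

    def get_or_call(self, prompt, llm_call):
        key = (self.seed, prompt)  # a different seed means a different key
        if key not in self.store:
            self.store[key] = llm_call(prompt)
        return self.store[key]


calls = []

def fake_llm(prompt):
    calls.append(prompt)
    return f"response to {prompt!r}"


with ToyCache() as cache:
    first = cache.get_or_call("2 + 2?", fake_llm)
    second = cache.get_or_call("2 + 2?", fake_llm)  # cache hit: no new call

print(first == second, len(calls))  # → True 1
```

Because the seed is part of the key, running the same request under `ToyCache(seed=1)` would miss this entry and call the model again, which mirrors why changing `cache_seed` produces fresh output.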
By default, [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) uses `.cache` for storage. To change the cache directory, set `cache_path_root`:

```python
with Cache.disk(cache_path_root="/tmp/autogen_cache") as cache:
    user.initiate_chat(assistant, message=coding_task, cache=cache)
```
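A disk-backed cache of this kind can be sketched in a few lines. This is a toy illustration of caching under a configurable root directory; the `disk_cached` helper and its one-file-per-prompt layout are invented for the example and are not AutoGen's on-disk format:

```python
import json
import os
import tempfile

# Toy sketch: persist each response as a JSON file under a configurable
# root directory (illustrative only; not AutoGen's on-disk layout).
def disk_cached(prompt, root, llm_call):
    os.makedirs(root, exist_ok=True)
    path = os.path.join(root, f"{abs(hash(prompt))}.json")
    if os.path.exists(path):       # cache hit: read the stored response
        with open(path) as f:
            return json.load(f)
    result = llm_call(prompt)      # cache miss: call the model and persist
    with open(path, "w") as f:
        json.dump(result, f)
    return result


root = tempfile.mkdtemp()  # stands in for cache_path_root
calls = []

def fake_llm(prompt):
    calls.append(prompt)
    return prompt.upper()


r1 = disk_cached("hello", root, fake_llm)
r2 = disk_cached("hello", root, fake_llm)  # served from disk, no new call
print(r1, r2, len(calls))  # → HELLO HELLO 1
```

The point of a disk-backed store is that the second call survives process restarts: anything cached under the same root is reusable across runs.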
For backward compatibility, [`DiskCache`](/docs/reference/cache/disk_cache#diskcache) is on by default with `cache_seed` set to 41. To disable caching completely, set `cache_seed` to `None` in the `llm_config` of the agent.

```python
assistant = AssistantAgent(
    "coding_agent",
    llm_config={
        "cache_seed": None,
        "config_list": OAI_CONFIG_LIST,
        "max_tokens": 1024,
    },
)
```
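The `cache_seed=None` semantics can be pictured with a small sketch (a hypothetical `cached_call` helper, not AutoGen internals): a non-`None` seed partitions the cache, while `None` bypasses it entirely, so every request reaches the model:

```python
# Toy sketch of the cache_seed semantics (not AutoGen internals):
# a non-None seed selects a cache partition; None disables caching.
def cached_call(prompt, cache_seed, store, llm_call):
    if cache_seed is None:           # caching disabled: always call the model
        return llm_call(prompt)
    key = (cache_seed, prompt)
    if key not in store:
        store[key] = llm_call(prompt)
    return store[key]


calls = []

def fake_llm(prompt):
    calls.append(prompt)
    return "answer"


store = {}
cached_call("q", 41, store, fake_llm)    # miss: calls the model
cached_call("q", 41, store, fake_llm)    # hit: served from the cache
cached_call("q", None, store, fake_llm)  # disabled: calls the model again
print(len(calls))  # → 2
```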

0 commit comments