Commit ed3188f

1 parent b071a6d commit ed3188f

File tree

1 file changed: 0 additions, 4 deletions


website/docs/topics/long_contexts.md (4 deletions)
@@ -12,7 +12,6 @@ Why do we need to handle long contexts? The problem arises from several constraints
 
 The `TransformMessages` capability is designed to modify incoming messages before they are processed by the LLM agent. This can include limiting the number of messages, truncating messages to meet token limits, and more.
 
-````{=mdx}
 :::info Requirements
 Install `pyautogen`:
 ```bash
@@ -21,7 +20,6 @@ pip install pyautogen
 
 For more information, please refer to the [installation guide](/docs/installation/).
 :::
-````
 
 ### Exploring and Understanding Transformations
 
@@ -114,11 +112,9 @@ user_proxy = autogen.UserProxyAgent(
 )
 ```
 
-```{=mdx}
 :::tip
 Learn more about configuring LLMs for agents [here](/docs/topics/llm_configuration).
 :::
-```
 
 We first need to write the `test` function that creates a very long chat history by exchanging messages between an assistant and a user proxy agent, and then attempts to initiate a new chat without clearing the history, potentially triggering an error due to token limits.
 
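For context on the document being edited: the `TransformMessages` capability prunes or truncates an agent's chat history before each LLM call. Below is a minimal sketch of how it is typically attached to an agent, assuming the pyautogen 0.2-era contrib API (`transform_messages.TransformMessages` with `MessageHistoryLimiter` and `MessageTokenLimiter` transforms); the configuration values are illustrative, not part of this commit.

```python
# Illustrative sketch (not part of this commit): attach TransformMessages so
# long chat histories are trimmed before they reach the LLM.
# Module and class names assume the pyautogen 0.2-era contrib API.
import autogen
from autogen.agentchat.contrib.capabilities import transform_messages, transforms

# Placeholder LLM configuration -- replace with your own model and key.
llm_config = {"config_list": [{"model": "gpt-4", "api_key": "YOUR_API_KEY"}]}

assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)

# Keep only the last 10 messages and cap the prompt at roughly 1000 tokens.
context_handling = transform_messages.TransformMessages(
    transforms=[
        transforms.MessageHistoryLimiter(max_messages=10),
        transforms.MessageTokenLimiter(max_tokens=1000),
    ]
)
context_handling.add_to_agent(assistant)
```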