 chat_queue (List[Dict]): a list of dictionaries containing the information of the chats.
-    Each dictionary should contain the following fields:
+    Each dictionary should contain the input arguments for `ConversableAgent.initiate_chat`.
+    More specifically, each dictionary could include the following fields:
+    recipient: the recipient agent.
+    - "sender": the sender agent.
     - "recipient": the recipient agent.
-    - "context": any context information, e.g., the request message. The following fields are reserved:
-        "message" needs to be provided if the `generate_init_message` method is not overridden.
-            Otherwise, input() will be called to get the initial message.
-        "summary_method": a string or callable specifying the method to get a summary from the chat. Default is DEFAULT_summary_method, i.e., "last_msg".
-        - Supported strings are "last_msg" and "reflection_with_llm":
-            when set to "last_msg", it returns the last message of the dialog as the summary.
-            when set to "reflection_with_llm", it returns a summary extracted using an llm client.
-                `llm_config` must be set in either the recipient or sender.
-                "reflection_with_llm" requires the llm_config to be set in either the sender or the recipient.
-        - A callable summary_method should take the recipient and sender agent in a chat as input and return a string of summary. E.g.,
-        ```python
-        def my_summary_method(
-            sender: ConversableAgent,
-            recipient: ConversableAgent,
-        ):
-            return recipient.last_message(sender)["content"]
-        ```
-        "summary_prompt" can be used to specify the prompt used to extract a summary when summary_method is "reflection_with_llm".
-            Default is None and the following default prompt will be used when "summary_method" is set to "reflection_with_llm":
-            "Identify and extract the final solution to the originally asked question based on the conversation."
-        "carryover" can be used to specify the carryover information to be passed to this chat.
-            If provided, we will combine this carryover with the "message" content when generating the initial chat
-            message in `generate_init_message`.
+    - clear_history (bool): whether to clear the chat history with the agent. Default is True.
+    - silent (bool or None): (Experimental) whether to print the messages for this conversation. Default is False.
+    - cache (Cache or None): the cache client to be used for this conversation. Default is None.
+    - max_turns (int or None): the maximum number of turns for the chat. If None, the chat will continue until a termination condition is met. Default is None.
+    - "message" needs to be provided if the `generate_init_message` method is not overridden.
+        Otherwise, input() will be called to get the initial message.
+    - "summary_method": a string or callable specifying the method to get a summary from the chat. Default is DEFAULT_summary_method, i.e., "last_msg".
+        - Supported strings are "last_msg" and "reflection_with_llm":
+            when set to "last_msg", it returns the last message of the dialog as the summary.
+            when set to "reflection_with_llm", it returns a summary extracted using an llm client.
+                `llm_config` must be set in either the recipient or sender.
+                "reflection_with_llm" requires the llm_config to be set in either the sender or the recipient.
+        - A callable summary_method should take the recipient and sender agent in a chat as input and return a string of summary. E.g.,
+        ```python
+        def my_summary_method(
+            sender: ConversableAgent,
+            recipient: ConversableAgent,
+        ):
+            return recipient.last_message(sender)["content"]
+        ```
+        "summary_prompt" can be used to specify the prompt used to extract a summary when summary_method is "reflection_with_llm".
+            Default is None and the following default prompt will be used when "summary_method" is set to "reflection_with_llm":
+            "Identify and extract the final solution to the originally asked question based on the conversation."
+        "carryover" can be used to specify the carryover information to be passed to this chat.
+            If provided, we will combine this carryover with the "message" content when generating the initial chat
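For orientation, here is a rough sketch of how a `chat_queue` built from the fields above might be used. The agent names, the LLM config values, and the agent-level `initiate_chats` helper that consumes the queue are illustrative assumptions and are not part of this diff; each entry is forwarded as keyword arguments to `ConversableAgent.initiate_chat` as described in the new docstring.

```python
# Minimal sketch, assuming pyautogen's ConversableAgent and an initiate_chats
# helper that consumes the chat_queue documented above. Names, model, and
# api_key are placeholders.
from autogen import ConversableAgent

llm_config = {"config_list": [{"model": "gpt-4", "api_key": "..."}]}  # placeholder config

researcher = ConversableAgent("researcher", llm_config=llm_config, human_input_mode="NEVER")
writer = ConversableAgent("writer", llm_config=llm_config, human_input_mode="NEVER")
critic = ConversableAgent("critic", llm_config=llm_config, human_input_mode="NEVER")

chat_queue = [
    {
        "recipient": writer,
        "message": "Draft a short summary of the findings.",
        "max_turns": 2,                # cap this chat at two round trips
        "summary_method": "last_msg",  # use the last message as the chat summary
        "clear_history": True,
    },
    {
        "recipient": critic,
        "message": "Review the draft for factual accuracy.",
        "max_turns": 1,
        "summary_method": "reflection_with_llm",  # needs llm_config on sender or recipient
    },
]

# Each dictionary becomes one initiate_chat call; the summary of each finished
# chat is carried over into the next chat's initial message.
results = researcher.initiate_chats(chat_queue)
for result in results:
    print(result.summary)
```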
"""Raise an exception if any async reply functions are registered.
@@ -763,6 +755,7 @@ def initiate_chat(
         clear_history: Optional[bool] = True,
         silent: Optional[bool] = False,
         cache: Optional[Cache] = None,
+        max_turns: Optional[int] = None,
         **context,
     ) -> ChatResult:
         """Initiate a chat with the recipient agent.
@@ -773,9 +766,12 @@ def initiate_chat(

         Args:
             recipient: the recipient agent.
-            clear_history (bool): whether to clear the chat history with the agent.
-            silent (bool or None): (Experimental) whether to print the messages for this conversation.
-            cache (Cache or None): the cache client to be used for this conversation.
+            clear_history (bool): whether to clear the chat history with the agent. Default is True.
+            silent (bool or None): (Experimental) whether to print the messages for this conversation. Default is False.
+            cache (Cache or None): the cache client to be used for this conversation. Default is None.
+            max_turns (int or None): the maximum number of turns for the chat between the two agents. One turn means one conversation round trip. Note that this is different from
+                [max_consecutive_auto_reply](#max_consecutive_auto_reply) which is the maximum number of consecutive auto replies; and it is also different from [max_rounds in GroupChat](./groupchat#groupchat-objects) which is the maximum number of rounds in a group chat session.
+                If max_turns is set to None, the chat will continue until a termination condition is met. Default is None.
             **context: any context information. It has the following reserved fields:
                 "message": a str of message. Needs to be provided. Otherwise, input() will be called to get the initial message.
                 "summary_method": a string or callable specifying the method to get a summary from the chat. Default is DEFAULT_summary_method, i.e., "last_msg".