
Reset Feature #3896

Closed
Kunjal1999 opened this issue Oct 23, 2024 · 14 comments

Comments

@Kunjal1999

Kunjal1999 commented Oct 23, 2024

What happened?

Hello, while using the ChatInitiator with

class ChatInitiator:
    def __init__(self, user_proxy: autogen.UserProxyAgent, assistant: autogen.AssistantAgent):
        ...

If a user inputs RESET, how do I reset the existing context (in other words, make the LLM forget everything it has remembered so far)?

What did you expect to happen?

Reset

How can we reproduce it (as minimally and precisely as possible)?

Hello, while using the ChatInitiator with

class ChatInitiator:
    def __init__(self, user_proxy: autogen.UserProxyAgent, assistant: autogen.AssistantAgent):
        ...

If a user inputs RESET, how do I reset the existing context (in other words, make the LLM forget everything it has remembered so far)?

AutoGen version

Latest one

Which package was this bug in

Core

Model used

No response

Python version

No response

Operating system

No response

Any additional info you think would be helpful for fixing this bug

No response

@qgzhang

qgzhang commented Oct 23, 2024

Perhaps check the initiate_chat() function at line 989 in conversable_agent.py. The parameter clear_history defaults to True, which clears the chat history with the peer agent.
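
For illustration, a minimal sketch of that parameter in use (assuming pyautogen v0.2 and assistant/user_proxy agents like the ones defined later in this thread):

# clear_history=True (the default) wipes the stored chat history with the peer
# agent before the new conversation starts; pass False to keep the prior context.
user_proxy.initiate_chat(
    assistant,
    message="What is the capital of France?",
    clear_history=True,
)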

@ekzhu
Collaborator

ekzhu commented Oct 23, 2024

Dear @Kunjal1999

Please see @qgzhang's comment.
 
Can you please use markdown code blocks for your code snippets, and in the future include the exact package version and a full code snippet for reproducing the issue? Otherwise we may not be able to help you.

Thanks,

@ekzhu ekzhu closed this as completed Oct 23, 2024
@Kunjal1999
Author

Hello,

I want an implementation such that when user inputs "RESET", the agent forgets all history.

from typing import Dict

import autogen
from autogen.coding import LocalCommandLineCodeExecutor


class AgentFactory:
    @staticmethod
    def create_assistant_agent(llm_config: Dict) -> autogen.AssistantAgent:
        return autogen.AssistantAgent(
            name="assistant",
            llm_config={
                "cache_seed": 42,
                "config_list": llm_config["config_list"],
                "temperature": 0,
            },
        )

    @staticmethod
    def create_user_proxy_agent() -> autogen.UserProxyAgent:
        return autogen.UserProxyAgent(
            name="user_proxy",
            human_input_mode="TERMINATE",
            max_consecutive_auto_reply=1,
            is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
            code_execution_config={
                "executor": LocalCommandLineCodeExecutor(work_dir="coding"),
            },
        )

class ChatInitiator:
    def __init__(self, user_proxy: autogen.UserProxyAgent, assistant: autogen.AssistantAgent):
        self.user_proxy = user_proxy
        self.assistant = assistant

    def initiate_chat(self, message: str) -> Dict:
        return self.user_proxy.initiate_chat(
            self.assistant,
            message=message,
            summary_method="reflection_with_llm",            
        )

@Kunjal1999
Author

Perhaps check the initiate_chat() function at line 989 in conversable_agent.py. The parameter clear_history defaults to True, which clears the chat history with the peer agent.

It does not get cleared, i.e. the LLM still remembers the context. I want functionality such that when the user enters RESET, the LLM forgets the context.

@qgzhang

qgzhang commented Oct 23, 2024 via email

I don't think AutoGen maintains a context beyond the conversation history, unless... you are talking about the cache?

@Kunjal1999
Author

My current application is something like this:

# llm_config is assumed to be loaded elsewhere, e.g. via autogen.config_list_from_json
sys_msg = "something"
init_msg = "something"
af = AgentFactory()
assistant = af.create_assistant_agent(llm_config)
user_proxy = af.create_user_proxy_agent()
assistant.update_system_message(sys_msg)
c = ChatInitiator(user_proxy, assistant)
output = c.initiate_chat(init_msg)

Now, whenever a user enters RESET, the LLM should forget everything except the sys_msg and the init_msg, i.e. it should forget everything the user has sent and it has remembered since then.
E.g.
User: What is the capital of France?
LLM: Paris
User: What is the capital of Germany?
LLM: Berlin
User: Which countries did we talk about previously?
LLM: France and Germany.
User: RESET
LLM: ...
User: Which countries did we talk about previously?
LLM: Did we even talk about countries?

@ekzhu
Collaborator

ekzhu commented Oct 24, 2024

In v0.2, you want to use the register_reply method to add a custom handler that watches for the RESET keyword and resets the message history.

See: https://microsoft.github.io/autogen/0.2/docs/reference/agentchat/conversable_agent#register_reply

Inside the registered reply function, you want to clear the list self._oai_messages[sender]

@Kunjal1999
Author

Kunjal1999 commented Oct 24, 2024

    @staticmethod
    def create_user_proxy_agent() -> autogen.UserProxyAgent:
        user_proxy = autogen.UserProxyAgent(
            name="user_proxy",
            human_input_mode="TERMINATE",
            max_consecutive_auto_reply=1,
            is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
            code_execution_config={
                "executor": LocalCommandLineCodeExecutor(work_dir="coding"),
            },
        )
        def reset_reply_func(
            recipient: autogen.UserProxyAgent,
            messages: Optional[List[Dict]] = None,
            sender: Optional[autogen.Agent] = None,
            config: Optional[Any] = None,
        ) -> Tuple[bool, Union[str, Dict, None]]:
            if messages and messages[-1].get("content", "").strip().upper() == "RESET":
                recipient._oai_messages[sender] = []
                return True, "Memory has been reset."
            return False, None

        user_proxy.register_reply(
            trigger="user_proxy",  
            reply_func=reset_reply_func,  
            position=0,  # Highest priority
            remove_other_reply_funcs=False  
        )
        return user_proxy

Hello, I tried implementing the register_reply function with self._oai_messages[sender] but it didn't work out. Kindly help.

@ekzhu
Collaborator

ekzhu commented Oct 24, 2024

You need to register it on the other (assistant) agent, not the user proxy.

@Kunjal1999
Author

Hello,
I tried registering it to the AssistantAgent and it is still not functioning in the expected way. Could you please show how to capture the RESET input sent by the user?

@ekzhu
Collaborator

ekzhu commented Oct 25, 2024

Can you update the trigger field to the actual instance, not the string?

Your code is almost there.
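
A minimal sketch of the corrected registration, assuming pyautogen v0.2 and the reset_reply_func from the earlier comment (note it is registered on the assistant, with the user_proxy instance as the trigger):

assistant.register_reply(
    trigger=user_proxy,           # the actual agent instance, not the string "user_proxy"
    reply_func=reset_reply_func,  # the function defined in the earlier comment
    position=0,                   # run before the default LLM reply function
)

With this registration, recipient inside reset_reply_func is the assistant and sender is the user_proxy, so recipient._oai_messages[sender] clears the assistant's stored history for that conversation.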

@Kunjal1999
Author

Kunjal1999 commented Oct 25, 2024

Hello,
Thanks for the suggestion!
The issue with using a register_reply function is that even though we can delete the message history, it still sends the output once, since the reply function is probably called AFTER the AssistantAgent sends its reply to the UserProxyAgent (after receiving RESET).
What is the difference between chat_messages and _oai_messages?
Besides chat_messages, _oai_messages and _human_input, do I need to clear anything else manually too?

@Kunjal1999
Author

Also, in a user_proxy, do we have any list of pointers to all the agents the user_proxy has been interacting with so far?
Currently a workaround I see is:
self.reply_at_receive.keys()
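
As an illustrative alternative (not from the thread), the chat_messages property in pyautogen v0.2 is a dict keyed by peer agent, so a sketch like the following could enumerate the peers and clear each stored history:

# Sketch only: chat_messages maps each peer agent to its stored message list,
# so iterating its keys gives every agent the user_proxy has exchanged messages with.
for peer in list(user_proxy.chat_messages.keys()):
    user_proxy.chat_messages[peer].clear()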

@ekzhu
Collaborator

ekzhu commented Oct 28, 2024

@Kunjal1999 Beyond the register reply function, you can also use register_hook to register a custom method on process_all_messages that transforms the messages into an empty list if RESET is detected in the last message.

You can also use the message transform feature to do the same thing: it transforms the messages into an empty list if RESET is detected in the last message.
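
A minimal sketch of the hook-based variant, assuming pyautogen v0.2 where the hookable method is named process_all_messages_before_reply (the exact hook name may vary between versions):

from typing import Dict, List

def reset_on_keyword(messages: List[Dict]) -> List[Dict]:
    # If the last message is RESET, hand the LLM an empty history for this reply.
    if messages and messages[-1].get("content", "").strip().upper() == "RESET":
        return []
    return messages

# Assumed hook name from pyautogen v0.2; check ConversableAgent.hook_lists for your version.
assistant.register_hook("process_all_messages_before_reply", reset_on_keyword)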

Generally, I can see the v0.4 Core API might suit you better if you are looking for more control over how messages are handled. You can take a look at the two-agent chat example in the v0.4 Core preview: https://microsoft.github.io/autogen/dev/user-guide/core-user-guide/quickstart.html
