Allow for get_human_input to be async #527

Closed
constantinidan opened this issue Nov 2, 2023 · 7 comments

@constantinidan
Hi!

Ideally, get_human_input should also be able to be async. Right now, it seems impossible to write code that is fully async because of this.

To do that, the function check_termination_and_human_reply would need to be made async, and the three calls to get_human_input would need to be awaited.

Thanks
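
For context, here is roughly the shape of the change being asked for: a user proxy whose human-input hook can be awaited. This is an illustrative sketch (the AsyncUserProxyAgent name and the a_get_human_input hook are assumptions at the time of this request, not an existing API):

import asyncio

from autogen import UserProxyAgent


class AsyncUserProxyAgent(UserProxyAgent):
    """Illustrative subclass whose human-input hook is awaitable."""

    async def a_get_human_input(self, prompt: str) -> str:
        # Run the blocking input() in a worker thread so the event loop
        # stays free; a real UI (chainlit, a websocket, ...) would await
        # its own input primitive here instead.
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(None, input, prompt)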

@victordibia added the dev label Nov 3, 2023
@sonichi
Contributor

sonichi commented Nov 4, 2023

This feature is added in v0.2.0b1. Could you test it?
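
For reference, driving the async code path would look roughly like this sketch (assuming the v0.2 API, where a_initiate_chat is the awaitable counterpart of initiate_chat and awaits a_get_human_input for human feedback; the llm_config values are placeholders):

import asyncio

from autogen import AssistantAgent, UserProxyAgent

llm_config = {"config_list": [{"model": "gpt-4", "api_key": "sk-..."}]}  # placeholder


async def main() -> None:
    assistant = AssistantAgent("assistant", llm_config=llm_config)
    user_proxy = UserProxyAgent("user_proxy", human_input_mode="ALWAYS", code_execution_config=False)
    # The a_* entry point keeps the whole chat loop on the event loop,
    # so human input can be awaited instead of blocking.
    await user_proxy.a_initiate_chat(assistant, message="Write a haiku about asyncio.")


asyncio.run(main())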

@my3sons

my3sons commented Feb 24, 2024

Hello, just curious if anyone has ever had complete success with this? I was recently trying to go fully async with a chainlit/autogen implementation but ran into this issue. I tried running with v0.2.0b1 and got a bit further downstream, but then hit the traceback below. NOTE: the code I am currently playing with, which generated the trace below, comes from the chainlit-autogen cookbook: https://github.com/Chainlit/cookbook/tree/main/pyautogen

Traceback (most recent call last):
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "main.py", line 122, in on_chat_start
    await user_proxy.a_initiate_chat(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 577, in a_initiate_chat
    await self.a_send(self.generate_init_message(**context), recipient, silent=silent)
  File "main.py", line 94, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 403, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 523, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 41, in a_send
    await super(ChainlitAssistantAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 403, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 523, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 94, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 401, in a_send
    valid = self._append_oai_message(message, "assistant", recipient)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 295, in _append_oai_message
    message = self._message_to_dict(message)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 277, in _message_to_dict
    return dict(message)
TypeError: 'coroutine' object is not iterable
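
The final TypeError here is characteristic of overriding a hook that the framework still calls synchronously with an async def: the caller gets back a coroutine object instead of a string, and dict(message) then fails. A minimal, framework-free reproduction of the type mismatch (names are illustrative):

class Base:
    def get_human_input(self, prompt: str) -> str:
        return input(prompt)


class Broken(Base):
    # Overriding the synchronous hook with an async def silently changes
    # the return type: calling it now yields a coroutine, not a str.
    async def get_human_input(self, prompt: str) -> str:
        return "hi"


reply = Broken().get_human_input(">> ")
print(type(reply))  # <class 'coroutine'> -- the same type as in the traceback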

@sonichi
Contributor

sonichi commented Feb 25, 2024

Could you try the latest version, v0.2.15?

@my3sons

my3sons commented Feb 26, 2024

Hello @sonichi, I upgraded to v0.2.15 and, again using the autogen example from the chainlit cookbook with human_input_mode="ALWAYS", I am running into this the first time the user_proxy asks for feedback. If I simply press return and go with the auto reply, the following message gets created:

{'role': 'user', 'content': <coroutine object ChainlitUserProxyAgent.get_human_input at 0x10b975cc0>}

That message ultimately ends up in content_str in autogen/code_utils.py and fails at line 65 because of the coroutine type:

if content is None:
    return ""
if isinstance(content, str):
    return content
if not isinstance(content, list):
    raise TypeError(f"content must be None, str, or list, but got {type(content)}")

Here is the complete stack trace:

Traceback (most recent call last):
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "main.py", line 135, in on_chat_start
    await user_proxy.a_initiate_chat(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 894, in a_initiate_chat
    await self.a_send(await self.a_generate_init_message(**context), recipient, silent=silent)
  File "main.py", line 103, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 731, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 45, in a_send
    await super(ChainlitAssistantAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 731, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 103, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 726, in a_receive
    self._process_received_message(message, sender, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 657, in _process_received_message
    self._print_received_message(message, sender)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 619, in _print_received_message
    print(content_str(content), flush=True)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/code_utils.py", line 65, in content_str
    raise TypeError(f"content must be None, str, or list, but got {type(content)}")
TypeError: content must be None, str, or list, but got <class 'coroutine'>

I am continuing to look into this but if anything comes to mind for you, please let me know. Thanks!
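
For anyone debugging something similar: content_str only renders the documented content shapes, so a quick check makes the failure obvious (assuming the v0.2 code_utils module shown in the trace):

from autogen.code_utils import content_str

print(content_str("hello"))                           # -> "hello"
print(content_str(None))                              # -> ""
print(content_str([{"type": "text", "text": "hi"}]))  # -> "hi" (list-of-parts form)
# Anything else, e.g. an un-awaited coroutine stored as message content, raises:
# TypeError: content must be None, str, or list, but got <class 'coroutine'>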

@my3sons

my3sons commented Feb 28, 2024

Hello @sonichi, I finally found some time to look into this further: the issue lies solely in the chainlit cookbook code, and autogen's async support is working perfectly. I will open a PR on the chainlit side to resolve it. Thanks for the work you do on autogen; it is truly exciting to work with these capabilities!
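
A fix on the cookbook side would presumably amount to overriding the async hook instead of the sync one, along these lines (a hypothetical sketch; the exact key in chainlit's AskUserMessage response differs across versions, so the lookup below is an assumption):

import chainlit as cl
from autogen import UserProxyAgent


class ChainlitUserProxyAgent(UserProxyAgent):
    async def a_get_human_input(self, prompt: str) -> str:
        # Await chainlit's UI prompt instead of blocking on stdin. Because
        # this overrides the async hook, the a_* call chain receives a str,
        # never a bare coroutine.
        reply = await cl.AskUserMessage(content=prompt, timeout=60).send()
        if not reply:
            return ""
        # Response key varies by chainlit version ("output" vs "content").
        return reply.get("output") or reply.get("content") or ""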

@julianakiseleva
Contributor

@my3sons @constantinidan can we close this issue?

@my3sons

my3sons commented Feb 29, 2024

Hello @julianakiseleva , from my perspective, this issue can be closed.

jackgerrits pushed a commit that referenced this issue Oct 2, 2024
* starting #520 - readme improvements

* more for #520

* tiny announcement

* cleanupheader

* simplifying

* move announcement to top

* rearrange logo

* we dont need the logo

* bumping the date

* added faq addressing #500