
[Bug]: TypeError: unsupported operand type(s) for +: 'int' and 'NoneType' #984

Closed
ragesh2000 opened this issue Dec 14, 2023 · 13 comments · Fixed by #1008
Assignees
Labels
group chat/teams group-chat-related issues

Comments

@ragesh2000

ragesh2000 commented Dec 14, 2023

Describe the bug

I get an error in an AutoGen group chat when I set human_input_mode="NEVER":

 File "/home/gpu/ai/llm/autogen/autogen_inference.py", line 47, in <module>
    res = user_proxy.initiate_chat(manager, message=msg)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 550, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 348, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 481, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 940, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/groupchat.py", line 291, in run_chat
    speaker = groupchat.select_speaker(speaker, self)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/groupchat.py", line 168, in select_speaker
    final, name = selector.generate_oai_reply(
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 625, in generate_oai_reply
    response = client.create(
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/oai/client.py", line 262, in create
    self._update_usage_summary(response, use_cache=False)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/oai/client.py", line 362, in _update_usage_summary
    self.total_usage_summary = update_usage(self.total_usage_summary)
  File "/home/gpu/miniconda3/envs/autogen/lib/python3.10/site-packages/autogen/oai/client.py", line 355, in update_usage
    "completion_tokens": usage_summary.get(response.model, {}).get("completion_tokens", 0)
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'

Steps to reproduce

No response

Expected Behavior

No response

Screenshots and logs

No response

Additional Information

No response

@ragesh2000 ragesh2000 added the bug label Dec 14, 2023
@afourney afourney added the group chat/teams group-chat-related issues label Dec 14, 2023
@afourney
Member

@ragesh2000 what version of AutoGen are you using?

@ragesh2000
Author

I am using pyautogen==0.2.1 @afourney

@mongolu

mongolu commented Dec 15, 2023

+1 here.
pyautogen==0.2.2

@yiranwu0
Collaborator

@ragesh2000 @mongolu Do you have any code with which I could replicate the error?

@mongolu

mongolu commented Dec 16, 2023

In autogen/oai/client.py, I've replaced line 365 with this:
"completion_tokens": usage_summary.get(response.model, {}).get("completion_tokens", 0) if usage_summary.get(response.model, {}).get("completion_tokens", 0) is not None else 0

I'm not sure I can help with reproducing the error, but I can confirm that I ran into it multiple times with multiple local LLMs.
I am running AutoGen in an Ollama Docker container, with LiteLLM.

If I can help more, please guide me to give you what info you require.
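For clarity, here is a minimal, self-contained sketch of the None-safe aggregation the workaround above implements. This is illustrative, not AutoGen's actual code: `safe_add_usage` is a hypothetical helper, and the field names (`prompt_tokens`, `completion_tokens`) are taken from the traceback. The point is that some local backends report `None` instead of an integer for usage fields, so any counter they are added to must coerce `None` to 0 first.

```python
# Illustrative sketch (not AutoGen's exact code): aggregate per-model token
# usage defensively when a backend reports None for some usage fields.

def safe_add_usage(usage_summary, model, prompt_tokens, completion_tokens):
    """Add one response's token counts to a per-model summary dict,
    treating None (as returned by some local LLM backends) as 0."""
    usage_summary = usage_summary or {}
    entry = usage_summary.setdefault(
        model, {"prompt_tokens": 0, "completion_tokens": 0}
    )
    # `x or 0` maps both None and 0 to 0, avoiding `int + NoneType`.
    entry["prompt_tokens"] += prompt_tokens or 0
    entry["completion_tokens"] += completion_tokens or 0
    entry["total_tokens"] = entry["prompt_tokens"] + entry["completion_tokens"]
    return usage_summary


summary = safe_add_usage(None, "llama-local", 10, None)  # None completion_tokens
summary = safe_add_usage(summary, "llama-local", 5, 7)
```

With the original code, the first call would raise the `TypeError` from the traceback; here it simply counts 0 completion tokens.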

@ragesh2000
Author

ragesh2000 commented Dec 16, 2023

I can also provide my code:

import autogen

config_list_llama = [
    {
        'base_url': "http://0.0.0.0:8000",
        'api_key': "NULL"
    }
]

llm_config_llama = {
    "config_list": config_list_llama,
}

user_proxy = autogen.UserProxyAgent(
   name="User_proxy",
   system_message="A human admin.",
   human_input_mode="NEVER",
)

analyser = autogen.AssistantAgent(
    name="Data analyser",
    llm_config=llm_config_llama,
)
critic = autogen.AssistantAgent(
    name="Critic",
    llm_config=llm_config_llama,
)

groupchat = autogen.GroupChat(agents=[user_proxy, analyser, critic], messages=[], max_round=10)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config_llama)
msg = """Download data from /data/inference_generation_sheet.csv and give me some valuable inferences.
      """
res = user_proxy.initiate_chat(manager, message=msg)

I am also using AutoGen with Ollama models and LiteLLM @kevin666aa

@yiranwu0
Collaborator

Thanks! @ragesh2000 @mongolu Can you check out #1008 and maybe copy-paste the code to see if it works? I can't replicate these errors quickly since I am not using local models.

@ragesh2000
Author

Yes that solves my issue @kevin666aa

@ragesh2000
Author

Also, I have noticed that if the message passed to user_proxy.initiate_chat is the same as before, the agent ignores the message and just returns the previous response. I think some kind of cache is involved in response generation. How can I solve this issue?

@mongolu

mongolu commented Dec 18, 2023

Delete the .cache dir.
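As an alternative to deleting the directory each run, a hedged sketch: in pyautogen 0.2.x the disk cache is keyed by a `cache_seed` entry in `llm_config` (renamed from `seed` in earlier releases), and setting it to `None` should disable caching entirely. Verify the parameter name against the docs for your installed version; the config below reuses the `config_list_llama` from the reproduction code above.

```python
# Sketch, assuming pyautogen 0.2.x: disable response caching via llm_config
# so identical prompts are re-sent to the model instead of replayed from disk.
config_list_llama = [
    {
        "base_url": "http://0.0.0.0:8000",
        "api_key": "NULL",
    }
]

llm_config_llama = {
    "config_list": config_list_llama,
    "cache_seed": None,  # None disables the disk cache (formerly the "seed" key)
}
```

Changing `cache_seed` to a different integer also forces a fresh run while keeping caching for that new seed.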

@mongolu

mongolu commented Dec 18, 2023

Yes that solves my issue @kevin666aa

I also confirm that the problem is solved with this.

@yiranwu0
Collaborator

Thanks! Will get it merged!

@ragesh2000
Author

Thanks @kevin666aa
