
TypeError: Additional kwargs key total_tokens already exists in left dict and value has unsupported type <class 'decimal.Decimal'>. #620

Open
azaylamba opened this issue Jan 6, 2025 · 6 comments

Comments

@azaylamba (Contributor) commented Jan 6, 2025

I am getting the above error during long conversations when using the RAG flow. The issue could also be on the langchain side, but I am not sure.
I am using the anthropic.claude-3-5-sonnet-20240620-v1:0 model.
These are the versions of the langchain libraries:

langchain==0.3.13
langchain-core==0.3.28
langchain-community==0.3.13
@azaylamba (Author):

Below is the stack trace:

"stack_trace": {
     "type": "TypeError",
     "value": "Additional kwargs key total_tokens already exists in left dict and value has unsupported type <class 'decimal.Decimal'>.",
     "module": "builtins",
     "frames": [
         {
             "file": "/var/task/adapters/base/base.py",
             "line": 183,
             "function": "run_with_chain_v2",
             "statement": "for chunk in conversation.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 5525,
             "function": "stream",
             "statement": "yield from self.bound.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 5525,
             "function": "stream",
             "statement": "yield from self.bound.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3407,
             "function": "stream",
             "statement": "yield from self.transform(iter([input]), config, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3394,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3357,
             "function": "_transform",
             "statement": "yield from final_pipeline"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 5561,
             "function": "transform",
             "statement": "yield from self.bound.transform("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 4820,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 4800,
             "function": "_transform",
             "statement": "for chunk in output.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 5525,
             "function": "stream",
             "statement": "yield from self.bound.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3407,
             "function": "stream",
             "statement": "yield from self.transform(iter([input]), config, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3394,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3357,
             "function": "_transform",
             "statement": "yield from final_pipeline"
         },
         {
             "file": "/opt/python/langchain_core/runnables/passthrough.py",
             "line": 576,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/passthrough.py",
             "line": 555,
             "function": "_transform",
             "statement": "for chunk in for_passthrough:"
         },
         {
             "file": "/opt/python/langchain_core/utils/iter.py",
             "line": 61,
             "function": "tee_peer",
             "statement": "item = next(iterator)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/passthrough.py",
             "line": 576,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/passthrough.py",
             "line": 566,
             "function": "_transform",
             "statement": "yield cast(dict[str, Any], first_map_chunk_future.result())"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/_base.py",
             "line": 456,
             "function": "result",
             "statement": "return self.__get_result()"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/_base.py",
             "line": 401,
             "function": "__get_result",
             "statement": "raise self._exception"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/thread.py",
             "line": 58,
             "function": "run",
             "statement": "result = self.fn(*self.args, **self.kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3847,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3832,
             "function": "_transform",
             "statement": "chunk = AddableDict({step_name: future.result()})"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/_base.py",
             "line": 449,
             "function": "result",
             "statement": "return self.__get_result()"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/_base.py",
             "line": 401,
             "function": "__get_result",
             "statement": "raise self._exception"
         },
         {
             "file": "/var/lang/lib/python3.11/concurrent/futures/thread.py",
             "line": 58,
             "function": "run",
             "statement": "result = self.fn(*self.args, **self.kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 5561,
             "function": "transform",
             "statement": "yield from self.bound.transform("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 1431,
             "function": "transform",
             "statement": "yield from self.stream(final, config, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/branch.py",
             "line": 367,
             "function": "stream",
             "statement": "for chunk in self.default.stream("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3407,
             "function": "stream",
             "statement": "yield from self.transform(iter([input]), config, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3394,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2197,
             "function": "_transform_stream_with_config",
             "statement": "chunk: Output = context.run(next, iterator)  # type: ignore"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 3357,
             "function": "_transform",
             "statement": "yield from final_pipeline"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 1413,
             "function": "transform",
             "statement": "for ichunk in input:"
         },
         {
             "file": "/opt/python/langchain_core/output_parsers/transform.py",
             "line": 64,
             "function": "transform",
             "statement": "yield from self._transform_stream_with_config("
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 2161,
             "function": "_transform_stream_with_config",
             "statement": "final_input: Optional[Input] = next(input_for_tracing, None)"
         },
         {
             "file": "/opt/python/langchain_core/runnables/base.py",
             "line": 1431,
             "function": "transform",
             "statement": "yield from self.stream(final, config, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 365,
             "function": "stream",
             "statement": "BaseMessageChunk, self.invoke(input, config=config, stop=stop, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 286,
             "function": "invoke",
             "statement": "self.generate_prompt("
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 786,
             "function": "generate_prompt",
             "statement": "return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 643,
             "function": "generate",
             "statement": "raise e"
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 633,
             "function": "generate",
             "statement": "self._generate_with_cache("
         },
         {
             "file": "/opt/python/langchain_core/language_models/chat_models.py",
             "line": 851,
             "function": "_generate_with_cache",
             "statement": "result = self._generate("
         },
         {
             "file": "/opt/python/langchain_aws/chat_models/bedrock_converse.py",
             "line": 491,
             "function": "_generate",
             "statement": "bedrock_messages, system = _messages_to_bedrock(messages)"
         },
         {
             "file": "/opt/python/langchain_aws/chat_models/bedrock_converse.py",
             "line": 685,
             "function": "_messages_to_bedrock",
             "statement": "messages = merge_message_runs(messages)"
         },
         {
             "file": "/opt/python/langchain_core/messages/utils.py",
             "line": 381,
             "function": "wrapped",
             "statement": "return func(messages, **kwargs)"
         },
         {
             "file": "/opt/python/langchain_core/messages/utils.py",
             "line": 571,
             "function": "merge_message_runs",
             "statement": "merged.append(_chunk_to_msg(last_chunk + curr_chunk))"
         },
         {
             "file": "/opt/python/langchain_core/messages/ai.py",
             "line": 395,
             "function": "__add__",
             "statement": "return add_ai_message_chunks(self, other)"
         },
         {
             "file": "/opt/python/langchain_core/messages/ai.py",
             "line": 412,
             "function": "add_ai_message_chunks",
             "statement": "additional_kwargs = merge_dicts("
         },
         {
             "file": "/opt/python/langchain_core/utils/_merge.py",
             "line": 58,
             "function": "merge_dicts",
             "statement": "merged[right_k] = merge_dicts(merged[right_k], right_v)"
         },
         {
             "file": "/opt/python/langchain_core/utils/_merge.py",
             "line": 68,
             "function": "merge_dicts",
             "statement": "raise TypeError(msg)"
         }
     ]
 }
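
The bottom frames show merge_message_runs combining adjacent AI messages and merge_dicts rejecting the Decimal value. For reference, a minimal sketch that reproduces the same TypeError, assuming two restored messages carry differing Decimal token counts under the same key (merge_dicts is the helper from the last frames; the numbers here are made up):

```python
# Minimal sketch of the failing merge, assuming the usage counts were
# deserialized as decimal.Decimal. merge_dicts is the langchain_core helper
# that appears in the last frames of the stack trace above.
from decimal import Decimal

from langchain_core.utils._merge import merge_dicts

left = {"usage": {"total_tokens": Decimal("14853")}}
right = {"usage": {"total_tokens": Decimal("14900")}}

# Raises: TypeError: Additional kwargs key total_tokens already exists in
# left dict and value has unsupported type <class 'decimal.Decimal'>.
merged = merge_dicts(left, right)
```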

@azaylamba (Author):

I have also raised the issue on the langchain repo: langchain-ai/langchain#29042

@azaylamba (Author):

The usage data in the metadata for the last successful message is as follows:

usage: {
    total_tokens: 14853,
    input_tokens: 13981,
    output_tokens: 872
}
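
These counts are plain integers in the model response, so the Decimal presumably comes from the persistence layer. If the chat history is stored in DynamoDB (an assumption on my part), that would explain it, since boto3's resource layer deserializes every stored number as decimal.Decimal. A sketch with hypothetical table and key names:

```python
# Illustration only: boto3's DynamoDB resource layer returns all numeric
# attributes as decimal.Decimal. Table and key names are hypothetical.
import boto3

table = boto3.resource("dynamodb").Table("ChatHistoryTable")
item = table.get_item(Key={"SessionId": "session-1"})["Item"]

# A total_tokens stored as 14853 is read back as Decimal('14853'), not int.
print(type(item["usage"]["total_tokens"]))  # <class 'decimal.Decimal'>
```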

@charles-marion (Collaborator):

Hi @azaylamba,

Are you able to reproduce it consistently, and/or with other models?
I sent several messages and was not able to reproduce it (I stopped at 20k tokens).

Note that there is a known issue with long RAG conversations where saving a large history causes problems: #580.
(If this is related to that, you could stop saving the metadata to the history to reduce its size, roughly as in the sketch below.)
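
A hypothetical sketch of that idea (placeholder names, not the actual linked code):

```python
# Hypothetical sketch of the suggestion: strip bulky metadata (including
# the "usage" token counts) before an AI message is written to the history.
from langchain_core.chat_history import BaseChatMessageHistory
from langchain_core.messages import AIMessage


def save_ai_message(history: BaseChatMessageHistory, text: str, metadata: dict) -> None:
    """Persist the response text with trimmed metadata."""
    # Dropping "usage" avoids the Decimal values that later break merge_dicts.
    slim = {k: v for k, v in metadata.items() if k != "usage"}
    history.add_message(AIMessage(content=text, additional_kwargs=slim))
```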

@azaylamba (Author) commented Jan 7, 2025

Hi @charles-marion, this is reproducible most of the time, but not always. Once it occurs, though, it reproduces consistently within that particular session.

I have not tried other models yet.
One thing I noticed: I recently changed the system prompt to make it more structured, and as a result it became lengthy (about 12k characters now, compared to 6k earlier). I started encountering this issue after that change.

I will try the change mentioned above to see if it solves the problem.

@azaylamba (Author):

I am no longer encountering the issue after removing the "usage" key from the metadata.
