anthropic[major]: support python 3.13 #27916
Conversation
Integration tests pass with 3.13: https://github.com/langchain-ai/langchain/actions/runs/11688847362
-        if not self.count_tokens:
-            raise NameError("Please ensure the anthropic package is loaded")
-        return self.count_tokens(text)
+        raise NotImplementedError(
To make this nonbreaking, could we just call the new endpoint with a single HumanMessage, subtract a constant number of standard tokens (e.g. for the `Human:` prefix), and print a warning?
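For illustration, a rough sketch of that suggestion (not code from this PR): call the beta token-counting endpoint with a single user message and subtract a constant allowance. The helper name, the `_PROMPT_OVERHEAD_TOKENS` value, and the model string are all hypothetical.

```python
import warnings

import anthropic

# Hypothetical allowance for the scaffolding the Messages API adds around raw text.
_PROMPT_OVERHEAD_TOKENS = 8


def count_tokens_fallback(text: str, model: str = "claude-3-5-sonnet-20241022") -> int:
    """Approximate the legacy count_tokens(text) via the beta messages endpoint."""
    warnings.warn(
        "Legacy count_tokens is unavailable; the count is approximated via "
        "client.beta.messages.count_tokens and may differ slightly."
    )
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    result = client.beta.messages.count_tokens(
        model=model,
        messages=[{"role": "user", "content": text}],
    )
    # Subtract a constant allowance for the message wrapper added around the text.
    return max(result.input_tokens - _PROMPT_OVERHEAD_TOKENS, 0)
```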
Older models like Claude 2 are unfortunately not supported: https://docs.anthropic.com/en/docs/build-with-claude/token-counting#supported-models
@@ -1113,6 +1113,40 @@ class AnswerWithJustification(BaseModel):
        else:
            return llm | output_parser

    @beta()
    def get_num_tokens_from_messages(self, messages: List[BaseMessage]) -> int:
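For context, here is a minimal standalone sketch of what a `get_num_tokens_from_messages` built on the new beta endpoint can look like. It assumes a naive role mapping; the merged implementation handles message conversion, system prompts, and tools differently, and the helper name here is illustrative.

```python
from typing import List

import anthropic
from langchain_core.messages import BaseMessage


def count_message_tokens(messages: List[BaseMessage], model: str) -> int:
    """Count tokens by delegating to the beta token-counting endpoint."""
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    # Naive role mapping: AI messages become "assistant", everything else "user";
    # system prompts and tool messages need special handling in practice.
    converted = [
        {
            "role": "assistant" if m.type == "ai" else "user",
            "content": m.content if isinstance(m.content, str) else str(m.content),
        }
        for m in messages
    ]
    result = client.beta.messages.count_tokens(model=model, messages=converted)
    return result.input_tokens
```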
Thoughts on updating the base implementation to take tools and/or kwargs? Definitely not an Anthropic-specific problem.
I'd support that.
Co-authored-by: Bagatur <[email protected]>
Last week Anthropic released version 0.39.0 of its Python SDK, which enabled support for Python 3.13. This release deleted the legacy `client.count_tokens` method, which we currently access during init of the `Anthropic` LLM. Anthropic has replaced this functionality with the `client.beta.messages.count_tokens()` API.

To enable support for `anthropic >= 0.39.0` and Python 3.13, here we drop support for the legacy token counting method and add support for the new method via `ChatAnthropic.get_num_tokens_from_messages`.

To fully support the token counting API, we update the signature of `get_num_tokens_from_messages` to accept tools everywhere.
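A hedged usage sketch based on the description above; the model name and the Pydantic tool schema are illustrative, and exact parameter handling may differ from the merged code.

```python
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage
from pydantic import BaseModel


class GetWeather(BaseModel):
    """Get the current weather for a location."""

    location: str


llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")

# The count is computed server-side via client.beta.messages.count_tokens,
# so an API key and network access are required.
n_tokens = llm.get_num_tokens_from_messages(
    [HumanMessage("What is the weather in San Francisco?")],
    tools=[GetWeather],
)
print(n_tokens)
```

Unlike the legacy `client.count_tokens`, which tokenized locally, the new endpoint counts tokens on the server, so token counting now involves credentials and a network round trip.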