add support for Anthropic Claude function call #2311
Conversation
I like it! I see the power in CustomModelClients. My PR is currently stuck due to a bug in LiteLLM, but this shows a potential way around LiteLLM use.
My only suggestion is to make sure to unit test the conversion functions; as they are static, it should be easy. Awesome work!
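To illustrate the suggestion, here is a minimal sketch of what such a unit test could look like. The helper name `convert_tools_to_functions` and the exact schemas are assumptions for illustration, not the notebook's actual code; the real converter maps OpenAI-style `tools` entries to Anthropic-style tool definitions.

```python
# Hypothetical static conversion helper: OpenAI `tools` -> Anthropic tools.
# Names and schema details are assumptions, not the notebook's exact code.

def convert_tools_to_functions(tools):
    """Map OpenAI-style `tools` entries to Anthropic-style tool definitions."""
    converted = []
    for tool in tools:
        if tool.get("type") == "function":
            fn = tool["function"]
            converted.append({
                "name": fn["name"],
                "description": fn.get("description", ""),
                # Anthropic expects the JSON schema under `input_schema`.
                "input_schema": fn.get("parameters", {}),
            })
    return converted


def test_convert_tools_to_functions():
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a location.",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }]
    result = convert_tools_to_functions(tools)
    assert result[0]["name"] == "get_weather"
    assert result[0]["input_schema"]["required"] == ["location"]
```

Because the function is static and pure, the test needs no mocking of the Claude API at all.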
This is a good notebook, thanks @levscaut... I ran it for my location (Sydney, Australia) and it responded:
The function call seems to work, which is great! I'm wondering about the lines:
Should it be more like?
When I ran it with the default 'Toronto'...
... it threw an exception: I take it that it decided to return code rather than prepare the message for a function call. I'm not sure why it's returning a response and then crashing.
Thanks for testing this. I managed to reproduce this issue, and it's due to the Claude API throwing an error when user input is empty like As for that function, it's just a simple example of a function that has all the description attributes (and my little complaint about Toronto being cloudy during the solar eclipse). I've removed all the conditions to avoid any misunderstanding on this.
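One workaround for the empty-input error, sketched below under the assumption that the Claude Messages API rejects messages whose text content is empty: substitute a placeholder before sending. The helper name `sanitize_messages` and the placeholder string are assumptions, not the PR's actual fix.

```python
# Hedged sketch: replace empty user content with a placeholder so the
# Claude API does not reject the request. Helper name is hypothetical.

def sanitize_messages(messages, placeholder="."):
    """Return a copy of `messages` with blank string content replaced."""
    sanitized = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, str) and not content.strip():
            # Copy rather than mutate the caller's dict.
            msg = {**msg, "content": placeholder}
        sanitized.append(msg)
    return sanitized
```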
Thanks for the suggestion. I'd like to add a unit test, but it seems very hard to import something that is defined in a notebook. So I think maybe when
I see. Yep, that's a challenge I have as well with function calling using LiteLLM+Ollama; it won't terminate naturally. If anyone can assist with how to handle both function and non-function calls with the same LLM: OpenAI's models can handle going between function calling and normal messages, but I haven't been able to do that with non-OpenAI models. Thanks for the explanation of the function; sorry about the bad weather, but at least you got the solar eclipse!
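One way to let a single client handle both cases is to dispatch on the response's content blocks, which in Anthropic's Messages API carry a `type` of `"text"` or `"tool_use"`. The routing helper below is a hypothetical sketch, not code from this PR:

```python
# Hedged sketch: route a Claude response to either the tool-call path or
# the plain-text path based on its content blocks. Helper is hypothetical.

def route_response(content_blocks):
    """Return ("tool", calls) if any tool_use block exists, else ("text", s)."""
    tool_calls = [b for b in content_blocks if b.get("type") == "tool_use"]
    if tool_calls:
        return "tool", tool_calls
    text = "".join(
        b.get("text", "") for b in content_blocks if b.get("type") == "text"
    )
    return "text", text
```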
Thanks for the notebook!
* add support for function call
* clear pip install output
* add convert function from `tools` to `functions`
* fix empty user input error (temporary)
Why are these changes needed?
Anthropic just announced that tool use is now in public beta in the Anthropic API. You can find more documentation about Claude's function call feature here.
As the Claude model support becomes more complete, we might consider migrating it to the core client module. The code within this notebook is becoming heavier and potentially too complex to manage efficiently.
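For readers unfamiliar with the new tool-use beta, a minimal sketch of how a tool-use request is shaped follows, based on the public Anthropic Messages API at the time of the beta. Building the payload as a plain dict keeps the logic testable without network access; the model name and tool schema here are illustrative assumptions.

```python
# Hedged sketch of an Anthropic tool-use request payload.
# Model name and tool definition are examples, not this PR's code.

def build_tool_use_request(model, user_text, tools, max_tokens=1024):
    """Assemble keyword arguments for a Messages API call with tools."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "tools": tools,
        "messages": [{"role": "user", "content": user_text}],
    }

weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a given location.",
    "input_schema": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

request = build_tool_use_request(
    "claude-3-opus-20240229",
    "What's the weather in Sydney?",
    [weather_tool],
)
# With the `anthropic` SDK installed, this would be sent as:
#   client = anthropic.Anthropic()
#   response = client.messages.create(**request)
```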
Related issue number
Checks