Custom Client support #831
Conversation
Codecov Report

Additional details and impacted files

@@            Coverage Diff             @@
##             main     #831      +/-   ##
===========================================
+ Coverage   26.54%   38.12%   +11.57%
===========================================
  Files          28       28
  Lines        3805     3864      +59
  Branches      865      917      +52
===========================================
+ Hits         1010     1473     +463
+ Misses       2724     2263     -461
- Partials       71      128      +57
Suggestion: use pre-commit to format the code.
I tried to resolve the merge conflicts but don't have access to the repo. @olgavrou, can you resolve the conflicts?
A customizable client adds a lot of value, and I think this PR moves the code in the right direction. A couple of things to consider in a broader context:
- The current PR addresses the client API. Ideally, we want to bind message-format API call parameters to the client as well. See #1129 ([Bug]: name Field Compatibility Issue in AssistantAgent and UserProxyAgent with fireworks.ai API) for some issues with alternative message formats.
- How should we approach LLM serving tools that aim for OpenAI compatibility, like vllm and litellm (see the sketch after this comment)? I think a customizable client layer in our code is still needed, since we may need to move faster than our dependencies.
Lastly, the PR moves a lot of code around, so we need to make sure the latest changes on the main branch were carried over to the new locations.
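To make the OpenAI-compatibility point concrete, here is a minimal sketch (not part of this PR) of pointing the openai Python client at a local vllm server's OpenAI-compatible endpoint; the URL, port, and model name are placeholder assumptions:

```python
# Sketch: calling a local vllm server through its OpenAI-compatible API.
# Assumes vllm is serving at http://localhost:8000/v1; the model name is a placeholder.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vllm's OpenAI-compatible endpoint
    api_key="not-needed",                 # vllm does not require a real key by default
)

response = client.chat.completions.create(
    model="meta-llama/Llama-2-7b-chat-hf",  # placeholder; use whatever model the server loaded
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```

Even with such compatibility layers, a customizable client in our own code gives us a place to adapt when a provider's message format or response shape diverges.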
closing in favour of #1345
Why are these changes needed?
This PR adds support for custom client calls. The idea is that the user can specify their own custom client class and load it into the configuration (e.g. for loading local models). As long as the class adheres to the Client class's interface and response protocol, everything should work with only this addition to the config (full usage is shown in the unit test).

The PR looks big, but it is mostly moving code around. If this is something we want, then I can go ahead and add docs and a notebook.
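To illustrate the intent, here is a minimal, hypothetical sketch of what a custom client and its config entry might look like; the class name, the method names (create, message_retrieval, cost), and the model_client_cls config key are illustrative assumptions based on the description above, not the PR's actual code:

```python
# Hypothetical sketch of a custom client that follows the Client-style interface
# and response protocol described above. Names and config keys are illustrative.
from types import SimpleNamespace


class CustomLocalClient:
    """A toy client wrapping a local model behind a Client-style interface."""

    def __init__(self, config, **kwargs):
        self.model = config.get("model", "local-model")

    def create(self, params):
        # A real implementation would call the local model here; a canned reply
        # keeps the sketch self-contained.
        prompt = params["messages"][-1]["content"]
        reply = f"[{self.model}] echo: {prompt}"
        # Return an object shaped like an OpenAI-style completion response.
        message = SimpleNamespace(role="assistant", content=reply, function_call=None)
        choice = SimpleNamespace(message=message, finish_reason="stop")
        return SimpleNamespace(choices=[choice], model=self.model, usage=None, cost=0)

    def message_retrieval(self, response):
        # Extract the text of each choice for the wrapper to consume.
        return [c.message.content for c in response.choices]

    def cost(self, response):
        return 0


# Hypothetical config entry: the wrapper would instantiate CustomLocalClient
# when it sees this class referenced in the config list.
config_list = [{"model": "local-model", "model_client_cls": "CustomLocalClient"}]
```

The key point is that the wrapper relies only on the interface and the response shape, so any backend (a local model, a different hosted API) can be plugged in through the config.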
Ideally, the client interface would not live under a directory called oai and the client wrapper wouldn't be called OpenAIWrapper but something more generic; that can be examined at a later date.

Related issue number
Checks