I wanted to let you know that we just merged a small change to the ChatCompletionClient interface: microsoft/autogen#4856

The effect on your lib is minimal. To update your code to match the new expected parameter of the OpenAIChatCompletionClient, you should update the following line:

autogen-openaiext-client/autogen_openaiext_client/client.py
Line 82 in ba307ae

Then pass the resulting value to the model_info param of the parent. Let me know if you have any questions!
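For context, a minimal sketch of the kind of change being described, not the actual contents of line 82: the import paths assume the current 0.4-series layout (older dev builds exported these from autogen_ext.models directly), and the subclass name and ModelInfo field values here are illustrative.

```python
# Sketch only: the subclass name and field values are placeholders,
# not the real autogen-openaiext-client code.
from autogen_core.models import ModelFamily, ModelInfo
from autogen_ext.models.openai import OpenAIChatCompletionClient


class ExampleExtChatCompletionClient(OpenAIChatCompletionClient):
    def __init__(self, **kwargs):
        # After microsoft/autogen#4856, the old capabilities dict is
        # replaced by a ModelInfo value that also carries the model family.
        model_info = ModelInfo(
            vision=False,
            function_calling=True,
            json_output=True,
            family=ModelFamily.UNKNOWN,
        )
        # Pass this value to the model_info param of the parent.
        super().__init__(model_info=model_info, **kwargs)
```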
Sorry, I missed this issue before archiving. I noticed that your PR adds support for passing all the OpenAI LLM-call arguments directly through the ModelFamily and base_url args, which kind of makes this repo redundant. I could still add a few classes that inherit from the 0.4.dev13 OpenAIClient version and support the other popular LLMs out there, which might save others some effort, but it seems your PR largely removes the need for this repo. Curious if you think the same.
Personally, I think there is value in what you're doing here: maintaining model info and providing easy-to-use wrappers for people choosing specific providers.
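For reference, a hedged sketch of the pass-through that the reply describes, using the upstream client directly; the endpoint, model name, and API key below are placeholders, not real values.

```python
# Sketch of pointing the upstream client at a non-OpenAI,
# OpenAI-compatible provider via base_url plus model_info.
from autogen_core.models import ModelFamily, ModelInfo
from autogen_ext.models.openai import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(
    model="provider-model-name",            # placeholder model name
    base_url="https://api.example.com/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                 # placeholder key
    model_info=ModelInfo(
        vision=False,
        function_calling=True,
        json_output=True,
        family=ModelFamily.UNKNOWN,
    ),
)
```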