Fix example for LLM configuration docs #1528
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@           Coverage Diff           @@
##             main    #1528   +/-   ##
=======================================
  Coverage   34.26%   34.26%
=======================================
  Files          42       42
  Lines        5102     5102
  Branches     1167     1167
=======================================
  Hits         1748     1748
  Misses       3210     3210
  Partials      144      144

Flags with carried forward coverage won't be shown. Click here to find out more.

☔ View full report in Codecov by Sentry.
LGTM 👍
Great catch!
There is another issue, perhaps related to this PR. When I run the following command outside of the project repo directory:
It complains about an import issue. Could be related to #1526
This can happen when you have an "autogen" directory in the current path. If you import it from other places, it works.
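A quick way to confirm this (a minimal sketch, not part of this PR): print where Python resolves the package from. If the path points at a local "autogen" folder instead of your site-packages install, that local folder is shadowing the installed package.

```python
import autogen

# If this prints a path inside the current working directory (e.g. ./autogen/__init__.py)
# rather than a site-packages path, a local "autogen" folder is shadowing the installed package.
print(autogen.__file__)
```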
* update the simplest llm config example
* formatting

---------

Co-authored-by: Chi Wang <[email protected]>
Why are these changes needed?
The current simplest example does not work, as the OpenAI client complains that "model" is not defined in the request.
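For illustration, here is a minimal sketch of a config that names the model explicitly, which is the kind of thing the fix points to. It assumes the pyautogen package and an OPENAI_API_KEY environment variable, and the model name is only a placeholder; the actual example in the docs may differ.

```python
import os
from autogen import AssistantAgent

# Hypothetical minimal config: the key point is that "model" is set explicitly,
# so the OpenAI client does not reject the request for a missing model.
llm_config = {
    "config_list": [
        {"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]},
    ],
}

assistant = AssistantAgent(name="assistant", llm_config=llm_config)
```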
Related issue number
Checks