
Fix example for LLM configuration docs #1528

Merged — 3 commits merged into main on Feb 4, 2024

Conversation

jackgerrits (Member)

Why are these changes needed?

The current simplest example does not work: the OpenAI client complains that "model" is not defined in the request.
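A hedged sketch of what a minimal working configuration might look like: the key point from this PR is that the config must carry a "model" entry, or the OpenAI client rejects the request. The model name and API key below are placeholders, not values taken from this PR.

```python
# Minimal llm_config sketch (illustrative only; model name and key are placeholders).
# The OpenAI client requires "model" to be set on every request, so the
# simplest example must include it in each config_list entry.
llm_config = {
    "config_list": [
        {
            "model": "gpt-4",     # required: without this the request fails
            "api_key": "sk-...",  # placeholder, not a real key
        }
    ],
}

print(llm_config["config_list"][0]["model"])
```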


@jackgerrits jackgerrits requested a review from sonichi February 4, 2024 00:30
@jackgerrits jackgerrits changed the title Fix example for LLM configuration Fix example for LLM configuration docs Feb 4, 2024
codecov-commenter commented Feb 4, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (1abbcf3) 34.26% vs. head (79d5037) 34.26%.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #1528   +/-   ##
=======================================
  Coverage   34.26%   34.26%           
=======================================
  Files          42       42           
  Lines        5102     5102           
  Branches     1167     1167           
=======================================
  Hits         1748     1748           
  Misses       3210     3210           
  Partials      144      144           
Flag        Coverage
unittests   34.26% <ø> (ø)

Flags with carried-forward coverage won't be shown.

@sonichi sonichi requested review from afourney, AaronWard and a team February 4, 2024 15:44
@AaronWard (Collaborator) left a comment:

LGTM 👍

@ekzhu (Collaborator) left a comment:

Great catch!


ekzhu commented Feb 4, 2024

There is another issue, perhaps related to this PR. When I run the following commands outside of the project repo directory:

>>> import autogen
>>> config_list = autogen.get_config_list(["test"], api_type = "openai")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'autogen' has no attribute 'get_config_list'

The import fails with an AttributeError. Could be related to #1526.


sonichi commented Feb 4, 2024

This can happen when you have an "autogen" directory in the current path: Python resolves the local directory instead of the installed package. If you import it from anywhere else, it works.
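A hedged sketch (not from this PR) reproducing the shadowing sonichi describes: a local package named `autogen` that appears earlier on `sys.path` hides the installed library, so attributes like `get_config_list` appear to be missing.

```python
# Illustrative reproduction of package shadowing. We create an empty local
# "autogen" package in a temp directory and put that directory at the front
# of sys.path, mimicking running Python from inside a folder that contains
# an "autogen" subdirectory.
import os
import sys
import tempfile

workdir = tempfile.mkdtemp()
pkg = os.path.join(workdir, "autogen")
os.makedirs(pkg)
# An __init__.py makes this a regular package, which wins over the
# installed library because workdir is searched first.
open(os.path.join(pkg, "__init__.py"), "w").close()

sys.path.insert(0, workdir)
import autogen  # resolves to the empty local package, not the real library

shadowed = not hasattr(autogen, "get_config_list")
print(shadowed)  # True: the shadow package defines nothing
```

Removing the local directory (or running from elsewhere, as sonichi notes) lets the import resolve to the installed package again.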

@sonichi sonichi enabled auto-merge February 4, 2024 19:17
@sonichi sonichi added this pull request to the merge queue Feb 4, 2024
Merged via the queue into microsoft:main with commit 3715a74 Feb 4, 2024
19 checks passed
whiskyboy pushed a commit to whiskyboy/autogen that referenced this pull request Apr 17, 2024
* update the simplest llm config example

* formatting

---------

Co-authored-by: Chi Wang <[email protected]>
5 participants