Error handling in getting LLM-based summary #1567
Codecov Report

```
@@            Coverage Diff            @@
##             main    #1567     +/-  ##
=========================================
+ Coverage   35.28%   42.33%   +7.04%
=========================================
  Files          44       44
  Lines        5311     5315       +4
  Branches     1231     1300      +69
=========================================
+ Hits         1874     2250     +376
+ Misses       3283     2863     -420
- Partials      154      202      +48
```

☔ View full report in Codecov by Sentry.
GroupChat had the same problem and resolved it this way: `autogen/autogen/agentchat/groupchat.py`, lines 346 to 354 at commit 3e33a2c.
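The referenced GroupChat code is not reproduced in this thread; as a rough illustration only (this is an assumed sketch of the general pattern, not a quote of the code at those lines), the shape of such a fix is to wrap the LLM call and fall back to a deterministic choice:

```python
# Illustrative sketch only -- NOT the actual code at groupchat.py lines 346-354.
# Idea: try LLM-driven speaker selection; if the model call fails, fall back
# to simple round-robin instead of crashing the group chat.

def select_next(agents, current, llm_pick):
    """Pick the next agent via `llm_pick`; fall back to round-robin on error."""
    try:
        name = llm_pick(agents)
        # Raises StopIteration (caught below) if the LLM returns an unknown name.
        return next(a for a in agents if a == name)
    except Exception:
        # Fall back to the agent after `current`, wrapping around.
        return agents[(agents.index(current) + 1) % len(agents)]
```

The fallback keeps the conversation progressing even when the model response is malformed or the request is rejected.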
Commits:
* summary exception
* badrequest error
* test
* skip reason
* error
Why are these changes needed?
The LLM-based summary method (introduced in #1402) is not robust. This PR handles errors so that the code fails gracefully.
A quick fix for some of the known failure cases was added in #1564, but it is not necessarily sufficient, and it may take some time to sort out a complete solution.
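To make the intent concrete, here is a minimal hedged sketch of failing gracefully when the summary call errors out. The names `summarize_with_fallback` and `llm_summarize` are hypothetical placeholders, not AutoGen's actual API:

```python
import warnings

# Hypothetical sketch: wrap an LLM summary call so that a provider error
# (e.g. a bad-request error) degrades to a non-LLM fallback instead of
# propagating and terminating the chat.

def summarize_with_fallback(messages, llm_summarize):
    """Try an LLM-based summary; fall back to the last message on any error."""
    try:
        return llm_summarize(messages)
    except Exception as e:
        warnings.warn(f"LLM summary failed ({e}); using last message instead.")
        return messages[-1] if messages else ""
```

Falling back to the last message mirrors the default (non-LLM) summary behavior, so a transient provider failure never crashes an otherwise successful conversation.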
Related issue number
Checks