
[Bug]: Retry remaining batch of prompts on failure #433

Closed
yadavsahil197 opened this issue Jul 6, 2023 · 1 comment
Labels
bug Something isn't working

Comments

@yadavsahil197
Contributor

Describe the bug
The labeling agent sends CHUNK_SIZE prompts at a time to the LLM for labeling. If the LLM fails to label some of the prompts, the entire chunk is left unlabeled.

To Reproduce
Steps to reproduce the behavior:
Use an LLM with a small context window and create a prompt example whose token count exceeds it. All prompts in the batch fail, even those whose tokens fit within the model's context window.

Expected behavior
The labeling agent should fail only the prompt that actually caused the error and retry the remaining prompts in the batch, as in the sketch below.
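For illustration only, here is a minimal sketch of the expected retry behavior. It does not reflect the project's internal API: CHUNK_SIZE is the only name taken from the issue, while `label_chunk`, `label_single`, and `MAX_TOKENS` are hypothetical stand-ins.

```python
from typing import List, Optional

CHUNK_SIZE = 5       # batch size; name taken from the issue, value is illustrative
MAX_TOKENS = 100     # pretend context-window limit for this sketch


def label_single(prompt: str) -> str:
    """Stand-in for a single-prompt LLM call."""
    if len(prompt.split()) > MAX_TOKENS:
        raise ValueError("prompt exceeds the model's context window")
    return "LABEL"


def label_chunk(chunk: List[str]) -> List[str]:
    """Stand-in for the batched call: one oversized prompt fails the whole chunk."""
    return [label_single(p) for p in chunk]


def label_with_retry(prompts: List[str]) -> List[Optional[str]]:
    """Label prompts in chunks; if a chunk fails, retry each prompt individually
    so only the prompt that actually errors out is left unlabeled (None)."""
    results: List[Optional[str]] = []
    for start in range(0, len(prompts), CHUNK_SIZE):
        chunk = prompts[start:start + CHUNK_SIZE]
        try:
            # First attempt: label the whole chunk in one batched call.
            results.extend(label_chunk(chunk))
        except Exception:
            # Fallback: retry each prompt on its own so a single oversized
            # prompt does not take down the rest of the batch.
            for prompt in chunk:
                try:
                    results.append(label_single(prompt))
                except Exception:
                    results.append(None)
    return results
```

With this fallback, a chunk containing one oversized prompt still yields labels for the other prompts, and only the offending prompt comes back as None.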

@yadavsahil197 yadavsahil197 added the bug Something isn't working label Jul 6, 2023
@Tyrest Tyrest self-assigned this Jul 10, 2023
@Abhinav-Naikawadi
Contributor

@Tyrest we can close this right? Done here: #443

@Tyrest Tyrest closed this as completed Jul 13, 2023