Fix replace_sampler missing the batch size under specific conditions #9367
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master    #9367     +/-   ##
=========================================
- Coverage      92%      88%       -4%
=========================================
  Files         178      178
  Lines       14899    14902        +3
=========================================
- Hits        13752    13142      -610
- Misses       1147     1760      +613
LGTM. Can you add a simple test with a dummy loader?
Something like
class DummyLoader(torch.utils.data.DataLoader):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
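For reference, a rough sketch of what such a dummy-loader test fixture could look like; this is not the test that was actually added, and the class name DummyBatchSamplerLoader is made up here. It builds a loader under the conditions from the PR description: batch_size arrives only through **kwargs and the instantiated loader carries a batch_sampler. A real test would then run Lightning's sampler replacement on this loader and assert the rebuilt loader still uses the original batch size.

```python
import torch
from torch.utils.data import BatchSampler, DataLoader, SequentialSampler, TensorDataset

# Hypothetical dummy loader (name made up): batch_size only arrives via
# **kwargs and is turned into a batch_sampler, so the resulting loader has a
# batch_sampler while its __init__ signature never mentions batch_size.
class DummyBatchSamplerLoader(DataLoader):
    def __init__(self, dataset, **kwargs):
        batch_size = kwargs.pop("batch_size", 1)
        batch_sampler = BatchSampler(SequentialSampler(dataset), batch_size=batch_size, drop_last=False)
        super().__init__(dataset, batch_sampler=batch_sampler, **kwargs)

dataset = TensorDataset(torch.arange(32).float())
loader = DummyBatchSamplerLoader(dataset, batch_size=4)
assert loader.batch_sampler.batch_size == 4
# A test for this fix would pass `loader` through Lightning's sampler
# replacement and assert the rebuilt loader keeps batch size 4.
```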
May be mistaken, but the
@SeanNaren my bad, overlooked this
LGTM!
What does this PR do?
Reported in Slack.
Fixes a bug where the batch_size was not properly reset under the following conditions:
- batch_size is provided to the DataLoader
- a batch_sampler is present initially
- the DataLoader signature passes the batch_size through **kwargs (see the sketch below)
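For illustration only, here is a minimal sketch of the failure mode in plain PyTorch; it is not Lightning's actual replace_sampler code, and KwargsLoader is a made-up name. When the subclass signature exposes only **kwargs, a re-instantiation that keeps just the explicit parameters of __init__ never sees batch_size, so the rebuilt loader silently falls back to the default of 1.

```python
import inspect
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical loader matching the last condition above: batch_size reaches the
# base DataLoader only through **kwargs and is not an explicit __init__ parameter.
class KwargsLoader(DataLoader):
    def __init__(self, dataset, **kwargs):
        super().__init__(dataset, **kwargs)

dataset = TensorDataset(torch.arange(64).float())
loader = KwargsLoader(dataset, batch_size=8)

# Naive signature-based rebuild (an illustration of the failure mode, not
# Lightning's code): keep only attributes named as explicit __init__ parameters.
explicit = [
    name
    for name, p in inspect.signature(type(loader).__init__).parameters.items()
    if name != "self" and p.kind not in (p.VAR_POSITIONAL, p.VAR_KEYWORD)
]
kept = {name: getattr(loader, name) for name in explicit if hasattr(loader, name)}
rebuilt = type(loader)(**kept)

print(loader.batch_size)   # 8
print(rebuilt.batch_size)  # 1 -- the batch size was dropped by the rebuild
```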
Does your PR introduce any breaking changes? If yes, please list them.
None
Before submitting
PR review