fix self.device access in DataParallel #6414
Merged
Conversation
awaelchli added the bug and strategy: dp (removed in pl) labels on Mar 8, 2021
Codecov Report

@@            Coverage Diff            @@
##           master    #6414     +/-   ##
=========================================
- Coverage      92%      87%       -5%
=========================================
  Files         194      194
  Lines       12355    12365      +10
=========================================
- Hits        11339    10696     -643
- Misses       1016     1669     +653
awaelchli requested review from Borda, carmocca, justusschock, kaushikb11, SeanNaren, tchaton and williamFalcon as code owners on April 11, 2021 03:58
awaelchli force-pushed the bugfix/dp-self-device branch from 53378ca to f4281c0 on April 11, 2021 04:12
justusschock approved these changes on Apr 11, 2021
tchaton approved these changes on Apr 12, 2021
Really Neat!
carmocca approved these changes on Apr 12, 2021
ananthsub approved these changes on Apr 12, 2021
SeanNaren pushed a commit that referenced this pull request on Apr 13, 2021
(cherry picked from commit 80c5293)
What does this PR do?
DataParallel does not maintain state on the replicas during forward. As a result, self.device in, e.g., training_step always points to the root device, regardless of which GPU the replica is actually running on.
We can fix this by taking the device from the input tensors, which are guaranteed to be on the correct device.
The only exception is when the model receives a batch that contains no tensors. This is rare, and the best we can do is warn the user that self.device will not be accurate.
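A minimal sketch of the idea: inside the wrapper that DataParallel runs on each replica, search the incoming batch for any tensor and use that tensor's device in place of the stale self.device. The helper names (find_tensor_device, update_device_from_batch) and the _device attribute below are illustrative assumptions, not the actual Lightning internals.

```python
import warnings
import torch


def find_tensor_device(batch):
    """Return the device of the first tensor found in a (possibly nested) batch, or None."""
    if isinstance(batch, torch.Tensor):
        return batch.device
    if isinstance(batch, (list, tuple)):
        for item in batch:
            device = find_tensor_device(item)
            if device is not None:
                return device
    if isinstance(batch, dict):
        for item in batch.values():
            device = find_tensor_device(item)
            if device is not None:
                return device
    return None


def update_device_from_batch(module, batch):
    """Point the module's device attribute at the device of the incoming batch, if possible."""
    device = find_tensor_device(batch)
    if device is None:
        # Rare case: the batch contains no tensors, so we cannot tell which
        # replica we are on; warn that self.device may still be the root device.
        warnings.warn(
            "Could not infer the device from the batch; "
            "self.device may still point to the root device."
        )
        return
    module._device = device  # assumed attribute backing the module's `device` property
```

With something like this, the per-replica wrapper would call update_device_from_batch(module, batch) at the start of each forward, so that self.device reflects the replica's GPU for the rest of the step.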
Fixes #6413
Fixes #6563
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the review guidelines.
Did you have fun?
Make sure you had fun coding 🙃