Disable memory sharing on model parameters in ddp-spawn #18238
Merged
Conversation
awaelchli added the bug (Something isn't working), strategy: ddp spawn, and fun (Staff contributions outside working hours, to differentiate from the "community" label) labels on Aug 6, 2023
github-actions bot added the fabric (lightning.fabric.Fabric) and pl (Generic label for PyTorch Lightning package) labels on Aug 6, 2023
Commits: a pre-commit.ci update (for more information, see https://pre-commit.ci), a merge …to bugfix/tensor-memory-sharing, and another pre-commit.ci update (for more information, see https://pre-commit.ci)
awaelchli commented on Aug 8, 2023
awaelchli force-pushed the bugfix/tensor-memory-sharing branch from a3733be to d10e1c4 on August 8, 2023 22:22
awaelchli requested review from carmocca, justusschock and williamFalcon as code owners on August 9, 2023 21:44
awaelchli changed the title from "Disable memory sharing on model parameters in ddp-spawn [TPU]" to "WIP: Disable memory sharing on model parameters in ddp-spawn [TPU]" on Aug 10, 2023
awaelchli changed the title from "WIP: Disable memory sharing on model parameters in ddp-spawn [TPU]" to "WIP: Disable memory sharing on model parameters in ddp-spawn" on Aug 13, 2023
awaelchli changed the title from "WIP: Disable memory sharing on model parameters in ddp-spawn" to "Disable memory sharing on model parameters in ddp-spawn" on Aug 13, 2023
awaelchli commented on Aug 13, 2023
Commit: a pre-commit.ci update (for more information, see https://pre-commit.ci)
carmocca approved these changes on Aug 14, 2023
Co-authored-by: Carlos Mocholí <[email protected]>
Borda approved these changes on Aug 15, 2023
Borda pushed a commit that referenced this pull request on Aug 28, 2023: Co-authored-by: Carlos Mocholí <[email protected]>, Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> (cherry picked from commit a0ca2c8)
lantiga pushed a commit that referenced this pull request on Aug 30, 2023: Co-authored-by: Carlos Mocholí <[email protected]>, Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com> (cherry picked from commit a0ca2c8)
awaelchli added the strategy: ddp (DistributedDataParallel) label and removed the strategy: ddp spawn label on Nov 4, 2023
Labels
bug (Something isn't working)
fabric (lightning.fabric.Fabric)
fun (Staff contributions outside working hours, to differentiate from the "community" label)
pl (Generic label for PyTorch Lightning package)
ready (PRs ready to be merged)
reproducibility
strategy: ddp (DistributedDataParallel)
What does this PR do?
Fixes #17399
The torch.multiprocessing.spawn launcher (strategy="ddp_spawn") enables memory sharing by default for all tensors passed through the spawned function, including the tensors inside modules. This means the underlying storage of these tensors is shared across all processes, and any process can read from or write to them. This can lead to inconsistencies, as demonstrated in the repro example in the linked issue. This PR disables that behavior by cloning the weights in each process, detaching them from shared memory.
Note: This only applies when running on CPU. When running on GPU, memory is not shared.
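For illustration, here is a minimal standalone sketch of the behavior described above, not the actual PR code: CPU tensors passed through torch.multiprocessing.spawn arrive in shared memory, and reassigning each parameter's data to a clone gives the process its own private storage. The module `Net` and the worker function `_worker` are hypothetical names used only for this example.

```python
import torch
import torch.multiprocessing as mp


class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(2, 2)


def _worker(rank: int, model: Net) -> None:
    # CPU tensors passed through `mp.spawn` land in shared memory,
    # so writes from any process are visible to all others.
    assert model.layer.weight.is_shared()

    # Sketch of the fix: replace each parameter's data with a clone so its
    # storage is private to this process (detached from shared memory).
    with torch.no_grad():
        for param in model.parameters():
            param.data = param.data.clone()
    assert not model.layer.weight.is_shared()


if __name__ == "__main__":
    model = Net()
    mp.spawn(_worker, args=(model,), nprocs=2, join=True)
```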
Hopefully, this will also help to avoid flakiness in tests.
cc @Borda @carmocca @justusschock @awaelchli