
Support loading distributed checkpoints for FSDP in Trainer #18358

Merged: 42 commits merged into master from feature/dist-checkpoint-fsdp on Aug 23, 2023

Conversation

@awaelchli (Contributor) commented Aug 21, 2023

What does this PR do?

Part 2 (#18364) completes support for sharded / distributed checkpointing with FSDP in the Trainer. The FSDP user guide will be updated in a follow-up PR.

There are minor breaking changes (when using FSDP only):

  • The checkpoint gets loaded after configure_sharded_model(), that is, after the model has been sharded and moved to the GPU. This avoids expensively loading the full checkpoint into CPU memory before sharding (the model might not fit). DeepSpeed is the other strategy that already does this. The change is necessary now to support sharded checkpoints, and it will also be needed later to support the meta device (for full-state checkpoints as well). See the usage sketch after this list.
  • The FSDPStrategy.load_optimizer_state_dict method becomes a no-op.
  • The FSDPStrategy.load_model_state_dict method becomes a no-op.

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the checklist below:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from the "Before submitting" checklist are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

cc @Borda @awaelchli @carmocca

@github-actions bot added the pl (Generic label for PyTorch Lightning package) label on Aug 21, 2023
@awaelchli changed the title from "Support sharded checkpoints in FSDP in Trainer" to "Support distributed checkpoints for FSDP in Trainer" on Aug 21, 2023
@awaelchli force-pushed the feature/dist-checkpoint-fsdp branch from 98b2528 to 1cdf15b on August 23, 2023 02:22
@awaelchli added the strategy: fsdp (Fully Sharded Data Parallel) label on Aug 23, 2023
@awaelchli added this to the 2.1 milestone on Aug 23, 2023
@awaelchli added the feature (Is an improvement or enhancement) and checkpointing (Related to checkpointing) labels on Aug 23, 2023
@awaelchli marked this pull request as ready for review on August 23, 2023 13:04
@github-actions bot commented Aug 23, 2023

⚡ Required checks status: All passing 🟢

Groups summary

🟢 pytorch_lightning: Tests workflow
| Check ID | Status |
| --- | --- |
| pl-cpu (macOS-11, lightning, 3.8, 1.11) | success |
| pl-cpu (macOS-11, lightning, 3.9, 1.12) | success |
| pl-cpu (macOS-11, lightning, 3.10, 1.13) | success |
| pl-cpu (macOS-11, lightning, 3.10, 2.0) | success |
| pl-cpu (macOS-11, lightning, 3.8, 1.11, oldest) | success |
| pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11) | success |
| pl-cpu (ubuntu-20.04, lightning, 3.9, 1.12) | success |
| pl-cpu (ubuntu-20.04, lightning, 3.10, 1.13) | success |
| pl-cpu (ubuntu-20.04, lightning, 3.10, 2.0) | success |
| pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11, oldest) | success |
| pl-cpu (windows-2022, lightning, 3.8, 1.11) | success |
| pl-cpu (windows-2022, lightning, 3.9, 1.12) | success |
| pl-cpu (windows-2022, lightning, 3.10, 1.13) | success |
| pl-cpu (windows-2022, lightning, 3.10, 2.0) | success |
| pl-cpu (windows-2022, lightning, 3.8, 1.11, oldest) | success |
| pl-cpu (macOS-11, pytorch, 3.8, 1.13) | success |
| pl-cpu (ubuntu-20.04, pytorch, 3.8, 1.13) | success |
| pl-cpu (windows-2022, pytorch, 3.8, 1.13) | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py, tests/tests_pytorch/strategies/test_fsdp.py.

🟢 pytorch_lightning: Azure GPU
| Check ID | Status |
| --- | --- |
| [pytorch-lightning (GPUs) (testing Lightning latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171160&view=logs&jobId=47e66f3c-897a-5428-da11-bf5c7745762e) | success |
| [pytorch-lightning (GPUs) (testing PyTorch latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171160&view=logs&jobId=3f274fac-2e11-54ca-487e-194c91f3ae9f) | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py, tests/tests_pytorch/strategies/test_fsdp.py.

🟢 pytorch_lightning: Benchmarks
| Check ID | Status |
| --- | --- |
| lightning.Benchmarks | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py.

🟢 pytorch_lightning: Docs
| Check ID | Status |
| --- | --- |
| docs-checks (pytorch, doctest) | success |
| make-html (pytorch) | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py.

🟢 mypy
| Check ID | Status |
| --- | --- |
| mypy | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py.

🟢 install
| Check ID | Status |
| --- | --- |
| install-pkg (ubuntu-22.04, app, 3.8) | success |
| install-pkg (ubuntu-22.04, app, 3.10) | success |
| install-pkg (ubuntu-22.04, fabric, 3.8) | success |
| install-pkg (ubuntu-22.04, fabric, 3.10) | success |
| install-pkg (ubuntu-22.04, pytorch, 3.8) | success |
| install-pkg (ubuntu-22.04, pytorch, 3.10) | success |
| install-pkg (ubuntu-22.04, lightning, 3.8) | success |
| install-pkg (ubuntu-22.04, lightning, 3.10) | success |
| install-pkg (ubuntu-22.04, notset, 3.8) | success |
| install-pkg (ubuntu-22.04, notset, 3.10) | success |
| install-pkg (macOS-12, app, 3.8) | success |
| install-pkg (macOS-12, app, 3.10) | success |
| install-pkg (macOS-12, fabric, 3.8) | success |
| install-pkg (macOS-12, fabric, 3.10) | success |
| install-pkg (macOS-12, pytorch, 3.8) | success |
| install-pkg (macOS-12, pytorch, 3.10) | success |
| install-pkg (macOS-12, lightning, 3.8) | success |
| install-pkg (macOS-12, lightning, 3.10) | success |
| install-pkg (macOS-12, notset, 3.8) | success |
| install-pkg (macOS-12, notset, 3.10) | success |
| install-pkg (windows-2022, app, 3.8) | success |
| install-pkg (windows-2022, app, 3.10) | success |
| install-pkg (windows-2022, fabric, 3.8) | success |
| install-pkg (windows-2022, fabric, 3.10) | success |
| install-pkg (windows-2022, pytorch, 3.8) | success |
| install-pkg (windows-2022, pytorch, 3.10) | success |
| install-pkg (windows-2022, lightning, 3.8) | success |
| install-pkg (windows-2022, lightning, 3.10) | success |
| install-pkg (windows-2022, notset, 3.8) | success |
| install-pkg (windows-2022, notset, 3.10) | success |

These checks are required after the changes to src/lightning/pytorch/strategies/deepspeed.py, src/lightning/pytorch/strategies/fsdp.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and updates every 180 seconds for 60 minutes. If you have any other questions, contact carmocca for help.

@awaelchli changed the title from "Support distributed checkpoints for FSDP in Trainer" to "Support loading distributed checkpoints for FSDP in Trainer" on Aug 23, 2023
@carmocca (Contributor) left a comment:


LGTM! It looks just as in Fabric now

Two review threads on src/lightning/pytorch/strategies/fsdp.py (outdated, resolved)
@mergify bot added the ready (PRs ready to be merged) label on Aug 23, 2023
@awaelchli merged commit fc6f43f into master on Aug 23, 2023
@awaelchli deleted the feature/dist-checkpoint-fsdp branch on August 23, 2023 17:39