
Add lazy checkpoint loading for FSDP full-state checkpoints (Trainer) #18379

Merged: 5 commits merged into master from feature/fsdp-lazy-load on Aug 24, 2023

Conversation

@awaelchli (Contributor) commented Aug 24, 2023

What does this PR do?

Fixes #8043
Follows the same logic as added in #18150

Currently, loading a full-state checkpoint means replicating it in CPU memory on every process, which can cause CPU OOM on machines with limited CPU RAM. Instead, this PR lazy-loads the checkpoint (no tensor data is read upfront) and materializes tensors on the fly as they are needed. This reduces memory usage to roughly one weight tensor held in memory per process at any given time during the model.load_state_dict() call.
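For illustration only, here is a minimal sketch of the lazy-loading idea. It is not what the PR wires into the FSDP strategy (which uses an internal Lightning lazy-load utility); the function name is hypothetical, and it assumes PyTorch 2.1+ so that `torch.load(..., mmap=True)` is available. The memory-mapped load keeps tensor data on disk until `load_state_dict()` copies each tensor into the module:

```python
import torch
import torch.nn as nn


def load_full_checkpoint_lazily(module: nn.Module, ckpt_path: str) -> None:
    # Hypothetical helper, not the PR's implementation.
    # Memory-map the checkpoint instead of reading the whole file into RAM:
    # tensor storages stay on disk and are paged in one at a time while
    # load_state_dict() copies them into the module's parameters.
    # Requires torch>=2.1 (mmap=True) and a zipfile-serialized checkpoint.
    checkpoint = torch.load(ckpt_path, map_location="cpu", mmap=True)
    # Trainer checkpoints store the model weights under "state_dict".
    state_dict = checkpoint.get("state_dict", checkpoint)
    module.load_state_dict(state_dict)
```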

cc @Borda @awaelchli @carmocca

@github-actions bot added the pl label Aug 24, 2023
@awaelchli added the feature, checkpointing, and strategy: fsdp labels and removed the pl label Aug 24, 2023
@awaelchli awaelchli modified the milestones: 2.0.x, 2.1 Aug 24, 2023
@github-actions bot added the pl label Aug 24, 2023
@awaelchli force-pushed the feature/fsdp-lazy-load branch from 64218b8 to a2c8780 on August 24, 2023 09:54
@awaelchli awaelchli changed the title WIP: FSDP lazy load full state checkpoint FSDP lazy load full state checkpoint Aug 24, 2023
@awaelchli awaelchli changed the title FSDP lazy load full state checkpoint Add lazy checkpoint loading for FSDP full-state checkpoints (Trainer) Aug 24, 2023
@awaelchli awaelchli marked this pull request as ready for review August 24, 2023 10:03
@github-actions bot commented Aug 24, 2023

⚡ Required checks status: All passing 🟢

Groups summary

🟢 pytorch_lightning: Tests workflow
Check ID Status
pl-cpu (macOS-11, lightning, 3.8, 1.11) success
pl-cpu (macOS-11, lightning, 3.9, 1.12) success
pl-cpu (macOS-11, lightning, 3.10, 1.13) success
pl-cpu (macOS-11, lightning, 3.10, 2.0) success
pl-cpu (macOS-11, lightning, 3.8, 1.11, oldest) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11) success
pl-cpu (ubuntu-20.04, lightning, 3.9, 1.12) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 1.13) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 2.0) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11, oldest) success
pl-cpu (windows-2022, lightning, 3.8, 1.11) success
pl-cpu (windows-2022, lightning, 3.9, 1.12) success
pl-cpu (windows-2022, lightning, 3.10, 1.13) success
pl-cpu (windows-2022, lightning, 3.10, 2.0) success
pl-cpu (windows-2022, lightning, 3.8, 1.11, oldest) success
pl-cpu (macOS-11, pytorch, 3.8, 1.13) success
pl-cpu (ubuntu-20.04, pytorch, 3.8, 1.13) success
pl-cpu (windows-2022, pytorch, 3.8, 1.13) success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py, tests/tests_pytorch/strategies/test_fsdp.py.

🟢 pytorch_lightning: Azure GPU
Check ID Status
[pytorch-lightning (GPUs) (testing Lightning latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171269&view=logs&jobId=47e66f3c-897a-5428-da11-bf5c7745762e) success
[pytorch-lightning (GPUs) (testing PyTorch latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171269&view=logs&jobId=3f274fac-2e11-54ca-487e-194c91f3ae9f) success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py, tests/tests_pytorch/strategies/test_fsdp.py.

🟢 pytorch_lightning: Benchmarks
Check ID Status
lightning.Benchmarks success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py.

🟢 pytorch_lightning: Docs
Check ID Status
docs-checks (pytorch, doctest) success
make-html (pytorch) success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py.

🟢 mypy
Check ID Status
mypy success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py.

🟢 install
Check ID Status
install-pkg (ubuntu-22.04, app, 3.8) success
install-pkg (ubuntu-22.04, app, 3.10) success
install-pkg (ubuntu-22.04, fabric, 3.8) success
install-pkg (ubuntu-22.04, fabric, 3.10) success
install-pkg (ubuntu-22.04, pytorch, 3.8) success
install-pkg (ubuntu-22.04, pytorch, 3.10) success
install-pkg (ubuntu-22.04, lightning, 3.8) success
install-pkg (ubuntu-22.04, lightning, 3.10) success
install-pkg (ubuntu-22.04, notset, 3.8) success
install-pkg (ubuntu-22.04, notset, 3.10) success
install-pkg (macOS-12, app, 3.8) success
install-pkg (macOS-12, app, 3.10) success
install-pkg (macOS-12, fabric, 3.8) success
install-pkg (macOS-12, fabric, 3.10) success
install-pkg (macOS-12, pytorch, 3.8) success
install-pkg (macOS-12, pytorch, 3.10) success
install-pkg (macOS-12, lightning, 3.8) success
install-pkg (macOS-12, lightning, 3.10) success
install-pkg (macOS-12, notset, 3.8) success
install-pkg (macOS-12, notset, 3.10) success
install-pkg (windows-2022, app, 3.8) success
install-pkg (windows-2022, app, 3.10) success
install-pkg (windows-2022, fabric, 3.8) success
install-pkg (windows-2022, fabric, 3.10) success
install-pkg (windows-2022, pytorch, 3.8) success
install-pkg (windows-2022, pytorch, 3.10) success
install-pkg (windows-2022, lightning, 3.8) success
install-pkg (windows-2022, lightning, 3.10) success
install-pkg (windows-2022, notset, 3.8) success
install-pkg (windows-2022, notset, 3.10) success

These checks are required after the changes to src/lightning/pytorch/strategies/fsdp.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and updated every 180 seconds for 60 minutes. If you have any other questions, contact carmocca for help.

@carmocca (Contributor) left a comment


This feature will likely be short-lived, but it's fine to add until the replacement is in.

@mergify mergify bot added the ready PRs ready to be merged label Aug 24, 2023
@awaelchli awaelchli merged commit e8f3863 into master Aug 24, 2023
@awaelchli awaelchli deleted the feature/fsdp-lazy-load branch August 24, 2023 13:35
Labels
checkpointing (Related to checkpointing), feature (Is an improvement or enhancement), pl (Generic label for PyTorch Lightning package), ready (PRs ready to be merged), strategy: fsdp (Fully Sharded Data Parallel)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

OOM issues with loading large model checkpoints w/ FSDP after checkpoint refactor
3 participants