Fix state dict loading in bitsandbytes plugin when checkpoint is already quantized #19886
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files (Coverage Diff):

|          | master | #19886 |   +/- |
|----------|-------:|-------:|------:|
| Coverage |    84% |    59% |  -25% |
| Files    |    426 |    421 |    -5 |
| Lines    |  35233 |  35135 |   -98 |
| Hits     |  29501 |  20716 | -8785 |
| Misses   |   5732 |  14419 | +8687 |
Great find, thank you @awaelchli
What does this PR do?
Fixes #19271
Since the Trainer directly imports the plugin from Fabric, this fix automatically applies there too.
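For context, here is a minimal sketch (not the actual implementation in this PR) of the kind of distinction the plugin has to make when loading a state dict: weights saved from an already-quantized bitsandbytes layer are stored as integer payloads, so re-quantizing them on load would corrupt them, whereas a regular float checkpoint still needs quantization. The helper name `needs_quantization` and the checkpoint path below are hypothetical.

```python
import torch


def needs_quantization(tensor: torch.Tensor) -> bool:
    # Hypothetical helper: bitsandbytes stores already-quantized weights as
    # integer payloads (e.g. torch.uint8), while an unquantized checkpoint
    # keeps floating-point weights that must still be quantized on load.
    return tensor.is_floating_point()


# Hypothetical usage when restoring a checkpoint into a quantized model:
state_dict = torch.load("checkpoint.pt", map_location="cpu")
for name, tensor in state_dict.items():
    if needs_quantization(tensor):
        ...  # quantize the float weight before assigning it to the bnb layer
    else:
        ...  # already quantized: copy the raw payload (and its quant state) as-is
```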
Checklist:
Internal discussion: https://pytorch-lightning.slack.com/archives/C05NRK0DH34/p1716296573578039
📚 Documentation preview 📚: https://pytorch-lightning--19886.org.readthedocs.build/en/19886/
cc @Borda @carmocca @justusschock @awaelchli