Bitsandbytes precision plugin #18655
Conversation
⚡ Required checks status: All passing 🟢
Groups summary:
🟢 pytorch_lightning: Tests workflow
🟢 pytorch_lightning: Azure GPU
🟢 pytorch_lightning: Benchmarks
🟢 fabric: Docs
🟢 pytorch_lightning: Docs
🟢 lightning_fabric: CPU workflow
🟢 lightning_fabric: Azure GPU
🟢 mypy
🟢 install
Thank you for your contribution! 💜
Codecov Report
Additional details and impacted files

@@            Coverage Diff            @@
##           master   #18655      +/-   ##
==========================================
- Coverage      83%      53%      -30%
==========================================
  Files         426      423        -3
  Lines       33381    33452       +71
==========================================
- Hits        27670    17857     -9813
- Misses       5711    15595     +9884
🎉
Great feature. We can start testing it now with lit-gpt.
@carmocca Would you like to also open an issue about the design concern we had around composing plugins in the future?
@awaelchli Opened #18679. Also working to update lit-gpt in Lightning-AI/litgpt#596
What does this PR do?
Fixes #18295
Closes #18559
Some minor improvements to the transformer engine integration are included as well; I thought of them while working on this plugin, since the two work very similarly.
Before:
After:
If the user requests skipping submodules with skip=..., quantization will only happen in fabric.setup.
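A minimal usage sketch, under assumptions: that the plugin is exposed as BitsandbytesPrecision in lightning.fabric.plugins and that "nf4" is one of its supported modes; the skip argument mentioned above is referenced only in a comment since its final spelling may differ from this PR's description.

```python
import torch
from lightning.fabric import Fabric
# Assumed import path; the plugin's final location/name may differ from this sketch.
from lightning.fabric.plugins import BitsandbytesPrecision

# Illustrative settings: 4-bit NF4 quantized weights with bfloat16 compute.
# The PR also adds a way to skip submodules (spelled `skip=...` in the description);
# when submodules are skipped, quantization is deferred to `fabric.setup` below.
precision = BitsandbytesPrecision(mode="nf4", dtype=torch.bfloat16)

fabric = Fabric(accelerator="cuda", devices=1, plugins=precision)
fabric.launch()

with fabric.init_module():
    # Any module containing Linear layers; a toy model for illustration.
    model = torch.nn.Sequential(torch.nn.Linear(128, 128), torch.nn.Linear(128, 2))

# The plugin replaces/quantizes the (non-skipped) Linear layers when the model is set up.
model = fabric.setup(model)
```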
📚 Documentation preview 📚: https://pytorch-lightning--18655.org.readthedocs.build/en/18655/
cc @Borda @carmocca @justusschock @awaelchli