[TPU] Proper half-precision implementation for XLA #18213
Conversation
⚡ Required checks status: All passing 🟢

Groups summary
🟢 pytorch_lightning: Tests workflow
🟢 pytorch_lightning: Azure GPU
🟢 pytorch_lightning: Benchmarks
🟢 fabric: Docs
🟢 pytorch_lightning: Docs
🟢 lightning_fabric: CPU workflow
🟢 lightning_fabric: Azure GPU
🟢 mypy
🟢 install
🟢 link-check

Thank you for your contribution! 💜
In a follow-up we should add support for casting the model weights to half too.
My interpretation was that this happens automatically, as described here, but for completeness we can add the conversion anyway if it doesn't hurt.
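A minimal sketch of what that follow-up could look like, assuming a `convert_module` hook on the precision plugin (the hook name follows Fabric's `Precision` API, but the body here is illustrative, not the PR's code):

import torch
from torch.nn import Module

# Illustrative only: explicit weight casting for the XLA precision plugin,
# in case the automatic float32 -> bfloat16 conversion is not relied upon.
def convert_module(self, module: Module) -> Module:
    # `self._desired_dtype` is set from the requested precision, as in the
    # snippet reviewed below.
    if self._desired_dtype in (torch.bfloat16, torch.float16):
        return module.to(dtype=self._desired_dtype)
    return module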
else:
    # Fall back to full precision when no half-precision dtype was requested
    self._desired_dtype = torch.float32

def convert_input(self, data: Any) -> Any:
Shouldn't this be removed too? I don't think the Trainer tests were failing due to a bug in the Trainer. Fabric should be impacted by this change too, and the Fabric test coverage probably doesn't include such a failure.
Probably yes. I can add an integration test, if there isn't one already, to find out.
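For reference, a hedged sketch of what such an integration test could look like (the precision string, accelerator flag, and test body are assumptions based on Fabric's public API; this exact test is not part of the PR):

import torch
from lightning.fabric import Fabric

# Hypothetical test sketch: check that input tensors are converted to
# bfloat16 when true half precision is requested on a TPU.
def test_xla_bf16_true_converts_inputs():
    fabric = Fabric(accelerator="tpu", devices=1, precision="bf16-true")
    fabric.launch()
    batch = torch.rand(2, 4)
    converted = fabric.strategy.precision.convert_input(batch)
    assert converted.dtype == torch.bfloat16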
What does this PR do?
Fixes #18172
Successful test run:
https://github.com/Lightning-AI/lightning/actions/runs/5802318415/job/15728813744?pr=18213
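For context, a minimal usage sketch of the behavior this PR targets, assuming the Trainer flags as of Lightning 2.x (illustrative, not taken from the PR):

import lightning.pytorch as pl

# With the fix, requesting true half precision on TPU should make the XLA
# precision plugin track the desired dtype (torch.bfloat16 here) instead of
# relying only on the XLA_USE_BF16 environment variable.
trainer = pl.Trainer(accelerator="tpu", devices=8, precision="bf16-true")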
Before submitting
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
cc @Borda @carmocca @JackCaoG @steventk-g @Liyang90 @justusschock @awaelchli