This repository has been archived by the owner on Sep 1, 2023. It is now read-only.

add support for precision="16-mixed" #7

Merged — awaelchli merged 1 commit into main from new_precision_vals on Feb 17, 2023

Conversation

justusschock (Member)

With Lightning-AI/pytorch-lightning#16783, precision=16 will change to precision='16-mixed'. This PR enables support for it.
A follow-up PR will disable support for precision=16 in the plugin and strategy once Lightning-AI/pytorch-lightning#16783 has landed.

Since nothing user-facing changes yet, the changelog entry will be added in the follow-up PR.
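For context, here is a minimal sketch of the value change this PR prepares for, assuming the public Lightning Trainer API (the import path and Trainer usage below are illustrative, not code from this plugin):

```python
# Illustrative only: assumes the public Lightning Trainer API, not this plugin's code.
from lightning.pytorch import Trainer

# Before Lightning-AI/pytorch-lightning#16783: an integer selects mixed precision.
trainer_old = Trainer(precision=16)

# After the change: the same behaviour is requested with an explicit string value.
trainer_new = Trainer(precision="16-mixed")
```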

@justusschock justusschock requested a review from Borda as a code owner February 17, 2023 13:21
codecov bot commented Feb 17, 2023

Codecov Report

Merging #7 (c87e31e) into main (cd1d0db) will not change coverage.
The diff coverage is 67%.

Additional details and impacted files
@@        Coverage Diff         @@
##           main    #7   +/-   ##
==================================
  Coverage    35%   35%           
==================================
  Files         4     4           
  Lines       274   274           
==================================
  Hits         97    97           
  Misses      177   177           

@awaelchli awaelchli merged commit 0103143 into main Feb 17, 2023
@awaelchli awaelchli deleted the new_precision_vals branch February 17, 2023 13:26