
Sharded Plugin 4/n: Expose scaler in amp plugin #4737

Merged
merged 7 commits into master from feature/817-fairscale-4n
Nov 18, 2020

Conversation

SeanNaren
Contributor

What does this PR do?

Related to #4178

Allows the precision plugin to manage the grad scaler. This is needed for Sharded training, as we require a custom scaler object.
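
For context, here is a minimal sketch of the idea, assuming a precision plugin that owns a `scaler` attribute and fairscale's `ShardedGradScaler`; the class names below are illustrative, not the exact code in this PR:

```python
# Illustrative sketch only -- class names are assumptions, not the exact
# Lightning implementation in this PR.
import torch
from fairscale.optim.grad_scaler import ShardedGradScaler


class NativeAMPPlugin:
    """Precision plugin that owns the default native AMP grad scaler."""

    def __init__(self):
        # Default scaler used for native (torch.cuda.amp) mixed precision.
        self._scaler = torch.cuda.amp.GradScaler()

    @property
    def scaler(self):
        return self._scaler


class ShardedNativeAMPPlugin(NativeAMPPlugin):
    """Sharded variant that swaps in fairscale's sharded-aware scaler."""

    def __init__(self):
        # Sharded DDP needs a scaler that understands the sharded optimizer state.
        self._scaler = ShardedGradScaler()
```

With the scaler exposed behind a property, the trainer can ask the active precision plugin for its scaler instead of hard-coding torch.cuda.amp.GradScaler.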

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified; bug fixes should be included in bug-fix release milestones (m.f.X) and features should be included in (m.X.b) releases.

Did you have fun?

Make sure you had fun coding 🙃

@codecov

codecov bot commented Nov 18, 2020

Codecov Report

Merging #4737 (1be6f23) into master (45c5760) will increase coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #4737   +/-   ##
======================================
  Coverage      93%     93%           
======================================
  Files         117     117           
  Lines        8954    8957    +3     
======================================
+ Hits         8321    8324    +3     
  Misses        633     633           

@@ -136,7 +136,7 @@ def setup_training(self, model: LightningModule):

# init amp. Must be done here instead of __init__ to allow ddp to work
if self.trainer.amp_backend == AMPType.NATIVE and self.trainer.precision == 16 and not self.trainer.use_tpu:
Contributor commented on this line:
for a future PR let's fold these if statements into accelerators
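
As a rough sketch of what delegating to the precision plugin could look like at this call site (the precision_connector/backend attribute names are assumptions, not necessarily the PR's exact code):

```python
# Hypothetical sketch: the setup hook asks the active precision plugin for its
# scaler instead of constructing torch.cuda.amp.GradScaler inline.
if self.trainer.amp_backend == AMPType.NATIVE and self.trainer.precision == 16 and not self.trainer.use_tpu:
    self.trainer.scaler = self.trainer.precision_connector.backend.scaler
```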

@tchaton tchaton left a comment

Getting there :)

@Borda Borda added the feature (Is an improvement or enhancement) label Nov 18, 2020
@Borda Borda left a comment

lgtm :]

@SeanNaren SeanNaren merged commit f0ab74d into master Nov 18, 2020
@SeanNaren SeanNaren deleted the feature/817-fairscale-4n branch November 18, 2020 22:30
rohitgr7 pushed a commit that referenced this pull request Nov 21, 2020
Labels
feature (Is an improvement or enhancement), refactor