[feat] Add support for schedulers #232
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master     #232   +/-   ##
=======================================
  Coverage   86.56%   86.56%
=======================================
  Files          57       58     +1
  Lines        2843     2910    +67
=======================================
+ Hits         2461     2519    +58
- Misses        382      391     +9
else:
    dataset_size = len(self.train_dataloader())

num_devices = max(1, self.trainer.num_gpus, self.trainer.num_processes)
Maybe someone else can confirm, but I think num_gpus is automatically set to num_processes now, so this isn't needed; we should be able to do just num_devices = num_processes.
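For context, here is a rough sketch of how the total number of training steps could be estimated for warmup-style schedulers, using the simplification suggested above. The helper name get_num_training_steps is an assumption for illustration, not the PR's final code, and the trainer attribute names follow the PyTorch Lightning version in use at the time; newer releases may differ.

import pytorch_lightning as pl


class SchedulerMixin(pl.LightningModule):

    def get_num_training_steps(self) -> int:
        """Estimate the total number of optimizer steps (assumed helper, for illustration)."""
        if self.trainer.max_steps and self.trainer.max_steps > 0:
            return self.trainer.max_steps

        # Number of batches per epoch on a single process.
        dataset_size = len(self.train_dataloader())

        # As suggested in the review: num_processes already reflects the GPU count,
        # so max(1, num_gpus, num_processes) can collapse to num_processes alone.
        num_devices = max(1, self.trainer.num_processes)

        effective_batches_per_step = self.trainer.accumulate_grad_batches * num_devices
        return (dataset_size // effective_batches_per_step) * self.trainer.max_epochs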
Nice, might want to get someone else who hasn't seen this code to also review :)
flash/core/schedulers.py
Outdated
if _TRANSFORMERS_AVAILABLE:
    from transformers import optimization
    functions = [getattr(optimization, n) for n in dir(optimization) if ("get_" in n and n != 'get_scheduler')]
Suggested change:
- functions = [getattr(optimization, n) for n in dir(optimization) if ("get_" in n and n != 'get_scheduler')]
+ functions: List[Callable] = [getattr(optimization, n) for n in dir(optimization) if ("get_" in n and n != 'get_scheduler')]
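As a standalone illustration of what this line collects: a registry of the scheduler-provider functions exposed by transformers.optimization. The dict _SCHEDULER_FUNCTIONS and its name are assumptions added here for clarity, not code from the PR.

from typing import Callable, Dict

try:
    from transformers import optimization
    _TRANSFORMERS_AVAILABLE = True
except ImportError:
    _TRANSFORMERS_AVAILABLE = False

# Map provider names to their factory functions, e.g.
# "get_linear_schedule_with_warmup" -> transformers.optimization.get_linear_schedule_with_warmup
_SCHEDULER_FUNCTIONS: Dict[str, Callable] = {}

if _TRANSFORMERS_AVAILABLE:
    for name in dir(optimization):
        # Skip the generic get_scheduler dispatcher; keep only the concrete providers.
        if "get_" in name and name != "get_scheduler":
            _SCHEDULER_FUNCTIONS[name] = getattr(optimization, name)

Each collected function is then called with an optimizer plus its scheduling arguments, for example get_linear_schedule_with_warmup(optimizer, num_warmup_steps, num_training_steps).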
Done!
Overall, looks good 😃 Can you maybe add if _TRANSFORMERS_AVAILABLE to the other files that use transformers (e.g. /text/seq2seq/summarization/data.py)?
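A minimal sketch of the availability-guard pattern being requested; the _module_available helper is an assumption here and may differ from Flash's own utility.

import importlib


def _module_available(module_path: str) -> bool:
    """Return True if the given module can be imported (hypothetical helper)."""
    try:
        importlib.import_module(module_path)
        return True
    except ImportError:
        return False


_TRANSFORMERS_AVAILABLE = _module_available("transformers")

if _TRANSFORMERS_AVAILABLE:
    # Only import transformers symbols when the optional dependency is installed,
    # so modules like summarization/data.py do not fail at import time without it.
    from transformers import AutoTokenizer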
What does this PR do?
This PR adds better support for optimizers and schedulers.
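To illustrate the kind of wiring this enables, here is a plain PyTorch Lightning sketch of returning an optimizer together with a transformers scheduler; it is not the Flash API introduced by this PR, and the model and hyperparameters are placeholders.

import pytorch_lightning as pl
import torch
from transformers.optimization import get_linear_schedule_with_warmup


class LitModel(pl.LightningModule):

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3)
        # Linear warmup followed by linear decay; step the scheduler every batch.
        scheduler = get_linear_schedule_with_warmup(
            optimizer, num_warmup_steps=100, num_training_steps=1000
        )
        return [optimizer], [{"scheduler": scheduler, "interval": "step"}]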
Fixes # (issue)
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃