Add support for Torch ORT to Transformer based Tasks #667
Conversation
# Conflicts:
#   flash/core/utilities/imports.py
#   flash/text/classification/model.py
for more information, see https://pre-commit.ci
Codecov Report
@@ Coverage Diff @@
## master #667 +/- ##
==========================================
- Coverage 90.01% 89.30% -0.71%
==========================================
Files 185 186 +1
Lines 9664 9696 +32
==========================================
- Hits 8699 8659 -40
- Misses 965 1037 +72
Flags with carried forward coverage won't be shown.
Awesome, looks really neat 😃 Do we need to add torch ORT to the text requirements if it will be enabled by default? Also, I guess the docs section needs to be added to the summarization task too?
I just realised that the only way to test is with an NVIDIA GPU or an AMD GPU. Do we test Flash at all on GPU? cc @ethanwharris
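On the "enabled by default" question: a minimal sketch of how an optional dependency like torch-ort could be guarded so the Task still imports cleanly when it isn't installed. Flash keeps checks like this in flash/core/utilities/imports.py, but the constant name and wiring below are assumptions for illustration, not the PR's actual code.

```python
# Hypothetical availability guard for torch-ort; the names here are
# illustrative, not the actual code in flash/core/utilities/imports.py.
import importlib.util

_TORCH_ORT_AVAILABLE = importlib.util.find_spec("torch_ort") is not None

if _TORCH_ORT_AVAILABLE:
    # ORTModule wraps an nn.Module so it runs on the ONNX Runtime backend
    from torch_ort import ORTModule
```

Torch ORT itself builds on onnxruntime-training, which ships CUDA and ROCm builds, which is why exercising it end to end needs NVIDIA or AMD hardware.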
Awesome, LGTM 😃
What does this PR do?
Adds Torch ORT support to Transformer-based Flash Tasks!
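A minimal usage sketch of what opting in could look like from a Task. The `enable_ort` flag name, the backbone, and the class count are assumptions for illustration; the underlying mechanism Torch ORT provides is wrapping the transformer in `torch_ort.ORTModule`.

```python
from flash.text import TextClassifier

# Sketch only: enable_ort is assumed to be the opt-in flag this PR adds;
# the backbone and num_classes are placeholder values.
model = TextClassifier(
    backbone="prajjwal1/bert-medium",
    num_classes=2,
    enable_ort=True,
)
```

Keeping it as an opt-in constructor flag means users without torch-ort installed (or without a supported GPU) are unaffected by default.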
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃