Pretrained flag and resnet50 pretrained weights #560
Conversation
Hello @ananyahjha93! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-07-12 14:39:26 UTC
Codecov Report
@@            Coverage Diff             @@
##           master     #560      +/-   ##
==========================================
- Coverage   91.33%   91.27%   -0.07%
==========================================
  Files         113      113
  Lines        7202     7206       +4
==========================================
- Hits         6578     6577       -1
- Misses        624      629       +5
Looking good 😃 I think there's some slightly faulty logic in the load function; maybe some tests for different pretrained configurations would be good.
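As a rough illustration of what a `pretrained` flag in the load logic could look like: the sketch below is not the PR's actual API; the function name `resnet50_backbone`, the `weights_url` parameter, and the use of torchvision's `resnet50` are all assumptions made for the example.

```python
# Hypothetical sketch only; names (resnet50_backbone, weights_url) are illustrative,
# not the functions added by this PR.
from typing import Optional

from torch.hub import load_state_dict_from_url
from torchvision.models import resnet50


def resnet50_backbone(pretrained: bool = False, weights_url: Optional[str] = None):
    """Build a resnet50 and optionally load pretrained weights from a URL."""
    model = resnet50()
    if pretrained:
        if weights_url is None:
            raise ValueError("pretrained=True requires a weights_url")
        # load_state_dict_from_url caches the download under torch.hub's cache dir
        # (~/.cache/torch/hub/checkpoints by default), so repeated calls reuse it.
        state_dict = load_state_dict_from_url(weights_url, map_location="cpu")
        model.load_state_dict(state_dict)
    return model
```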
Branch force-pushed from fc65fa5 to 4dda19f.
@ethanwharris is there a way to cache the pretrained weights? Adding tests for pretrained configurations will add significant time to the test suite, since large models need to be downloaded before testing. Hence I wasn't sure whether to include them.
@ananyahjha93 I think they will get cached automatically, but I'm not sure. We should still unit test the function though; you could use …
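One common way to unit test a pretrained configuration without downloading real weights is to stub out the download with pytest's `monkeypatch` fixture. The sketch below assumes a loader that ultimately calls `torch.hub.load_state_dict_from_url`; the helper `_load_pretrained_resnet50` is a stand-in for the PR's real function, not part of its API.

```python
# Sketch only: _load_pretrained_resnet50 stands in for the real loader under test.
import torch
from torchvision.models import resnet50


def _load_pretrained_resnet50(url: str):
    model = resnet50()
    state_dict = torch.hub.load_state_dict_from_url(url, map_location="cpu")
    model.load_state_dict(state_dict)
    return model


def test_pretrained_configuration_offline(monkeypatch):
    # Serve locally built weights instead of downloading, so the test is fast,
    # works offline, and does not depend on CI caching large checkpoints.
    fake_state_dict = resnet50().state_dict()
    monkeypatch.setattr(
        torch.hub, "load_state_dict_from_url", lambda url, **kwargs: fake_state_dict
    )

    model = _load_pretrained_resnet50("https://example.com/resnet50.pt")
    assert sum(p.numel() for p in model.parameters()) > 0
```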
Awesome, LGTM 😃
What does this PR do?
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃