This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

Pretrained flag and resnet50 pretrained weights #560

Merged
merged 23 commits into master on Jul 12, 2021

Conversation

@ananyahjha93 (Contributor) commented Jul 9, 2021

What does this PR do?

  1. Change the `pretrained` flag in `ImageClassifier` to load self-supervised pretrained weights for backbones.
  2. Add pretrained SimCLR, SwAV, and Barlow Twins weights for the resnet50 backbone from VISSL.
  3. Remove the dependency on bolts for pretrained SSL weights.

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests? [not needed for typos/docs]
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@pep8speaks commented Jul 9, 2021

Hello @ananyahjha93! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2021-07-12 14:39:26 UTC

@codecov bot commented Jul 9, 2021

Codecov Report

Merging #560 (d15d66b) into master (48bdfd8) will decrease coverage by 0.06%.
The diff coverage is 75.86%.


@@            Coverage Diff             @@
##           master     #560      +/-   ##
==========================================
- Coverage   91.33%   91.27%   -0.07%     
==========================================
  Files         113      113              
  Lines        7202     7206       +4     
==========================================
- Hits         6578     6577       -1     
- Misses        624      629       +5     
Flag        Coverage Δ
unittests   91.27% <75.86%> (-0.07%) ⬇️


Impacted Files                         Coverage Δ
flash/image/classification/model.py    78.84% <28.57%> (-7.83%) ⬇️
flash/image/backbones.py               86.00% <90.90%> (-0.41%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@ethanwharris (Collaborator) left a comment

Looking good 😃 I think there's some slightly faulty logic in the load function; maybe some tests for the different pretrained configurations would be good.

@ananyahjha93 (Contributor, Author)

@ethanwharris is there a way to cache the pretrained weights? Adding tests for pretrained configurations will add significant time to the test suite, since large models need to be downloaded before testing. Hence I wasn't sure whether to include this.
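For context on the caching question: checkpoints fetched through `torch.hub.load_state_dict_from_url` are cached on disk under the hub directory, so repeated runs re-use the file rather than re-downloading. A quick way to inspect the cache location:

```python
import torch

# torch.hub stores downloaded checkpoints under <hub_dir>/checkpoints.
# hub_dir defaults to ~/.cache/torch/hub and can be redirected via the
# TORCH_HOME environment variable or torch.hub.set_dir().
print(torch.hub.get_dir())
```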

@ethanwharris (Collaborator) commented Jul 12, 2021

@ananyahjha93 I think they will get cached automatically, but I'm not sure. We should still unit test the function though; you could use `patch` to prevent things from being downloaded if that's an issue.
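The `patch` suggestion can be sketched as follows. The loader function and URL here are hypothetical stand-ins (the real one lives in `flash/image/backbones.py`); the point is that `unittest.mock.patch` replaces the download call so the test exercises the loading logic without touching the network:

```python
from unittest import mock

import torch


def load_backbone_weights(model, url):
    """Hypothetical loader: fetch a state dict from ``url`` and load it."""
    state_dict = torch.hub.load_state_dict_from_url(url)
    model.load_state_dict(state_dict)
    return model


def test_loader_without_download():
    model = torch.nn.Linear(4, 2)
    # Build a fake checkpoint matching the model's own state dict keys.
    fake_state = {k: torch.zeros_like(v) for k, v in model.state_dict().items()}
    # Patch the download call so no network access happens.
    with mock.patch(
        "torch.hub.load_state_dict_from_url", return_value=fake_state
    ) as dl:
        loaded = load_backbone_weights(model, "https://example.com/fake.pt")
    dl.assert_called_once_with("https://example.com/fake.pt")
    assert torch.equal(loaded.weight, torch.zeros(2, 4))
```

This keeps the unit test fast and deterministic while still verifying that the correct URL is requested and the state dict is applied.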

@ananyahjha93 ananyahjha93 enabled auto-merge (squash) July 12, 2021 14:50
@Borda added the "bug / fix (Something isn't working)" label Jul 12, 2021
@ethanwharris added the "enhancement (New feature or request)" label and removed the "bug / fix (Something isn't working)" label Jul 12, 2021
@ethanwharris (Collaborator) left a comment

Awesome, LGTM 😃

@ananyahjha93 ananyahjha93 merged commit bf1526f into master Jul 12, 2021
@ananyahjha93 ananyahjha93 deleted the simclr-strategy branch July 12, 2021 16:34
Labels
enhancement New feature or request
4 participants