Detection recipe enhancements #5715

Merged: 3 commits into pytorch:main on Apr 1, 2022

Conversation

@datumbox (Contributor) commented Mar 31, 2022

Fixes #5307

This PR cherry-picks some of the improvements from #5444 to simplify review.

Improve the existing detection recipes:

  • Add support for the LSJ (Large Scale Jitter) and multiscale augmentation strategies (a sketch follows this list)
  • Add --opt and --norm-weight-decay support in Detection
  • Add support for InstanceNorm and LocalResponseNorm layers in split_normalization_params
  • Other minor nit fixes across references
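
As an illustration of the augmentation bullet above, here is a minimal sketch of an LSJ-style transform: draw a random scale factor relative to a fixed target canvas and resize the image and its boxes accordingly (a fixed-size crop/pad step typically follows so batched images share one resolution). The class name, target_size, and scale_range below are assumptions for the sketch, not the exact code merged in this PR.

import torch
from torch import Tensor, nn
from torchvision.transforms import functional as F


class ScaleJitter(nn.Module):
    """Illustrative LSJ-style transform; names and defaults are assumptions, not the merged code."""

    def __init__(self, target_size=(1024, 1024), scale_range=(0.1, 2.0)):
        super().__init__()
        self.target_size = target_size
        self.scale_range = scale_range

    def forward(self, image: Tensor, target: dict):
        orig_h, orig_w = image.shape[-2:]
        # Draw a random scale and fit the jittered size inside the target canvas.
        scale = self.scale_range[0] + torch.rand(1).item() * (self.scale_range[1] - self.scale_range[0])
        r = min(self.target_size[0] / orig_h, self.target_size[1] / orig_w) * scale
        new_h, new_w = int(orig_h * r), int(orig_w * r)

        image = F.resize(image, [new_h, new_w])
        if "boxes" in target:
            target["boxes"] = target["boxes"] * r  # xyxy boxes scale linearly with the image
        return image, target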

@facebook-github-bot commented Mar 31, 2022

💊 CI failures summary and remediations

As of commit 162f76d (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

@datumbox (Contributor, Author) left a comment:
Adding a few comments to clarify what is what.

Resolved review threads on: references/classification/train.py, references/detection/presets.py (×2), references/detection/train.py (×2)

elif opt_name == "adamw":
optimizer = torch.optim.AdamW(parameters, lr=args.lr, weight_decay=args.weight_decay)
else:
raise RuntimeError(f"Invalid optimizer {args.opt}. Only SGD and AdamW are supported.")

@datumbox (Contributor, Author):
Straight copy-paste from classification.

Reviewer (Contributor):
Since this is copy-pasted I think it can stay as is, and if we want to change it we can do so in a different PR. Still, I wonder: is there a reason why for sgd we use startswith and for adamw we use ==?

@datumbox (Contributor, Author):
You are right! I've indeed copy-pasted it from classification but replaced the previous sgd optimizer line. The problem is that the existing recipe didn't contain the nesterov momentum update. I've just updated the file to support it; it's not something I've used so far, but it's a simple update.
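
For context on the startswith question: in the classification recipe the sgd branch is matched with startswith so that a variant such as "sgd_nesterov" can enable Nesterov momentum, while adamw has no variants and is compared with ==. A sketch of the full block under that assumption (args and parameters come from the surrounding train.py, as in the snippet above; the merged file may differ slightly):

import torch

opt_name = args.opt.lower()
if opt_name.startswith("sgd"):
    optimizer = torch.optim.SGD(
        parameters,
        lr=args.lr,
        momentum=args.momentum,
        weight_decay=args.weight_decay,
        # "sgd_nesterov" also starts with "sgd", which is why startswith is used here
        nesterov="nesterov" in opt_name,
    )
elif opt_name == "adamw":
    optimizer = torch.optim.AdamW(parameters, lr=args.lr, weight_decay=args.weight_decay)
else:
    raise RuntimeError(f"Invalid optimizer {args.opt}. Only SGD and AdamW are supported.")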

Resolved review threads on: torchvision/models/detection/faster_rcnn.py, torchvision/models/detection/transform.py, torchvision/ops/_utils.py
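
The torchvision/ops/_utils.py thread ties back to the split_normalization_params bullet in the description. A minimal sketch of the idea, assuming the helper lives in that file: split a model's parameters into normalization and non-normalization groups, with InstanceNorm and LocalResponseNorm included among the normalization classes, so that --norm-weight-decay can apply a separate weight decay to them. The exact implementation in torchvision may differ.

from torch import nn


def split_normalization_params(model: nn.Module, norm_classes=None):
    # Default normalization layers; per the PR description, InstanceNorm and LocalResponseNorm are included.
    if norm_classes is None:
        norm_classes = [
            nn.modules.batchnorm._BatchNorm,
            nn.LayerNorm,
            nn.GroupNorm,
            nn.modules.instancenorm._InstanceNorm,
            nn.LocalResponseNorm,
        ]
    classes = tuple(norm_classes)

    norm_params, other_params = [], []
    for module in model.modules():
        if next(module.children(), None) is not None:
            # Container module: take only the parameters it owns directly; children are visited separately.
            other_params.extend(p for p in module.parameters(recurse=False) if p.requires_grad)
        elif isinstance(module, classes):
            norm_params.extend(p for p in module.parameters() if p.requires_grad)
        else:
            other_params.extend(p for p in module.parameters() if p.requires_grad)
    return norm_params, other_params


# Hypothetical wiring in train.py (model and args come from the surrounding recipe):
# norm_params, other_params = split_normalization_params(model)
# parameters = [
#     {"params": norm_params, "weight_decay": args.norm_weight_decay},
#     {"params": other_params, "weight_decay": args.weight_decay},
# ]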

@jdsgomes (Contributor) left a comment:
LGTM!


@@ -64,7 +64,6 @@ def test_get_weight(name, weight):
 )
 def test_naming_conventions(model_fn):
     weights_enum = _get_model_weights(model_fn)
-    print(weights_enum)

Reviewer (Contributor):
profit!

@datumbox merged commit d59398b into pytorch:main on Apr 1, 2022
@datumbox deleted the references/recipe_detection branch on Apr 1, 2022 at 07:36
@datumbox mentioned this pull request on Apr 1, 2022
facebook-github-bot pushed a commit that referenced this pull request Apr 6, 2022
Summary:
* Detection recipe enhancements

* Add back nesterov momentum

Reviewed By: NicolasHug

Differential Revision: D35393165

fbshipit-source-id: 32156eddebfcb71e708be2b2a7b7ccef1e48c88e

Successfully merging this pull request may close these issues:

Improve the accuracy of Detection & Segmentation models by using SOTA recipes and primitives
3 participants