WARNING - The model and loaded state dict do not match exactly / UserWarning: DeprecationWarning: pretrained is deprecated, please use “init_cfg” instead #5177
A new mmdetection version was released 17 days ago (https://github.com/open-mmlab/mmdetection/releases/tag/v2.12.0 <- second list item under Backwards Incompatible Changes).
Actually, you are supposed to add an override to init_cfg, but it works okay this way. Then make the corresponding change in your config (or train script); a sketch of the migration is below.
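The migration would look roughly like this (a sketch based on the v2.12.0 release notes; the FasterRCNN/ResNet-50 names and the torchvision:// checkpoint string are just illustrative, adapt them to your model):

```python
# Before (deprecated): top-level `pretrained` key in the model dict.
# model = dict(
#     type='FasterRCNN',
#     pretrained='torchvision://resnet50',
#     backbone=dict(type='ResNet', depth=50))

# After (v2.12.0+): move the checkpoint into the backbone's init_cfg.
model = dict(
    type='FasterRCNN',
    backbone=dict(
        type='ResNet',
        depth=50,
        init_cfg=dict(type='Pretrained',
                      checkpoint='torchvision://resnet50')))
```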
I have the same issue, and thanks for your reply! I guess I need to specify a "layer" key in my init_cfg, but I have totally no clue how to do that.
You have to add an override key, as the warning says. But I have no clue how to do that for the pretrained part of init_cfg. I've checked the official docs and configs and found nothing (even mmdet's own GitHub configs still use the deprecated pretrained key in the model dict). In other init_cfgs, override is used to override init_cfg layers, but our case has no layers (if we use only pretrained). In either case, the warning doesn't harm training or validating models, so just ignore it.
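For comparison, here is the layer/override pattern that the warning is written for (a sketch following mmcv's init_cfg convention; the conv_cls name is illustrative):

```python
init_cfg = dict(
    # Initialize every Conv2d submodule from N(0, 0.01).
    type='Normal', layer='Conv2d', std=0.01,
    # `override` re-initializes one named submodule differently,
    # e.g. a classification conv that wants a bias prior.
    override=dict(type='Normal', name='conv_cls', std=0.01, bias_prob=0.01))
```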
I see, I see. It is hard for beginners like me to figure out how to modify the init_cfg without official documentation and examples, and all I need is just to use a pretrained model. The most closely related official doc is maybe open-mmlab/mmcv#780, but still, it is not helpful for me. Thanks a lot for your feedback!
The warning says two things: (1) your init_cfg has no layer key, and (2) if you don't define an override key either, the init_cfg will do nothing. But a Pretrained init_cfg needs neither; loading a checkpoint is all it is supposed to do. So this warning is just the result of a small logic flaw (the warning shouldn't be shown for pretrained init_cfg dicts).
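Put differently, the check behind the warning only inspects the layer key, along these lines (paraphrased pseudologic, not the actual mmcv source):

```python
import warnings

def check_init_cfg(cfg):
    # Hypothetical paraphrase: the warning fires for any init_cfg dict
    # without a `layer` key, even though type='Pretrained' needs neither
    # `layer` nor `override` to do its job (it just loads a checkpoint).
    if 'layer' not in cfg:
        warnings.warn('init_cfg without layer key, if you do not define '
                      'override key either, this init_cfg will do nothing')
```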
Thanks for your further explanation; now I more or less understand what happens. So if I want no pretrained model (random initialization), should I use init_cfg=dict(type='Pretrained', checkpoint=None)?
If you don't want pretrained model weights (e.g. for ResNet), you can just remove the dict with type='Pretrained' from init_cfg (or remove init_cfg entirely, if the pretrained dict is its only content).
That's right. Not sure about frozen_stages and paramwise_cfg (I've never specified these), but it looks right.
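So a random-initialization config would look roughly like this (a sketch; the FasterRCNN/ResNet-50 names are illustrative, and frozen_stages/paramwise_cfg are left out as discussed):

```python
# No init_cfg on the backbone -> default (random) weight initialization.
model = dict(
    type='FasterRCNN',
    backbone=dict(
        type='ResNet',
        depth=50))
```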
GREAT!!!! Thanks for your reply! Without it I could hardly have continued my work!
UserWarning: init_cfg without layer key, if you do not define override key either, this init_cfg will do nothing
WARNING - The model and loaded state dict do not match exactly
UserWarning: DeprecationWarning: pretrained is deprecated, please use “init_cfg” instead