[Features] Add logger for initialization of parameters #1150
Conversation
Codecov Report

```diff
@@            Coverage Diff             @@
##           master    #1150      +/-   ##
==========================================
+ Coverage   68.00%   68.28%   +0.27%
==========================================
  Files         160      160
  Lines       10443    10590     +147
  Branches     1902     1935      +33
==========================================
+ Hits         7102     7231     +129
- Misses       2968     2977       +9
- Partials      373      382       +9
```
@MeowZheng & @nbei may take a look.
LGTM
LGTM
Motivation
I add an initialization log to record the initialization type of all the parameters. This can help users use the `init_cfg` in `BaseModule`.
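For context, this is the kind of workflow the log is meant to support: a minimal sketch, assuming the usual `BaseModule`/`init_cfg` interface of mmcv. The toy module and the chosen config values are made up for illustration and are not part of this PR.

```python
import torch.nn as nn
from mmcv.runner import BaseModule


class ToyHead(BaseModule):
    """Toy module; its name and layers are illustrative only."""

    def __init__(self, init_cfg=dict(type='Xavier', layer='Conv2d')):
        super().__init__(init_cfg=init_cfg)
        self.conv = nn.Conv2d(3, 8, 3)


model = ToyHead()
# Applies the init_cfg; with this PR, how every parameter was
# initialized is additionally recorded and written to the logger.
model.init_weights()
```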
Modification
Add `params_init_info` to all modules from the top-most level module (such as the `Detector` of mmdetection and the `Classifier` of mmclassification), in mmcv/runner/base_module.py. The `params_init_info` is a dict: each key is a parameter of the model, and each value is a dict with three keys (a sketch of one entry is given after this list).

- `params_name` (str): the name of the parameter.
- `init_info` (str): the initialization information of the parameter. It begins with the type of initialization from `INITIALIZERS`, such as `XavierInit` or `KaimingInit`, followed by the arguments of that initialization. Note that sometimes we don't specify the initialization type, or the initialized value is exactly equal to the default PyTorch initialization, and some `BaseModule`s may call `init_weights` during the `__init__` of the model. At last, you may overload `init_weights` of `BaseModule` and do a self-defined initialization for some parameters; this case is caught through `tmp_mean_value` below.
- `tmp_mean_value` (obj: `torch.FloatTensor`): the current mean value of the parameter. The value is updated after initialization in `INITIALIZERS`; when it differs from the real mean of the parameter, it means that the user initialized the parameter with a self-defined `init_weights` instead of `init_cfg`.
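Below is a rough sketch of what one entry of `params_init_info` might look like and how the `tmp_mean_value` comparison can reveal a self-defined `init_weights`. The parameter name, the stored strings, and the exact field handling are illustrative assumptions and may differ from the actual implementation in base_module.py.

```python
import torch
import torch.nn as nn

# A parameter initialized the way an INITIALIZERS entry (e.g. XavierInit) would do it.
param = nn.Parameter(torch.empty(8, 3, 3, 3))
nn.init.xavier_normal_(param)

# Hypothetical entry keyed by the parameter object.
params_init_info = {
    param: dict(
        params_name='bbox_head.conv.weight',  # illustrative name
        init_info='XavierInit: gain=1, distribution=normal',
        tmp_mean_value=param.data.mean(),  # mean recorded right after initialization
    )
}

# Later, a user-defined init_weights changes the parameter outside of init_cfg.
with torch.no_grad():
    param.fill_(0.01)

# Detection: the recorded mean no longer matches the parameter's actual mean,
# so the log can report that the parameter was initialized by a self-defined
# init_weights rather than by init_cfg.
info = params_init_info[param]
if not torch.isclose(param.data.mean(), info['tmp_mean_value']):
    info['init_info'] = 'Initialized by user-defined init_weights'
```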
BC-breaking
None
The initialization information is only dumped into the log file.
Below is part of the log message of RetinaNet initialization.