Conversation
@@ -145,7 +145,10 @@ def create_model(model_name='naive'):
 def create_pruner(model, pruner_name, optimizer=None):
     pruner_class = prune_config[pruner_name]['pruner_class']
     config_list = prune_config[pruner_name]['config_list']
-    return pruner_class(model, config_list, optimizer)
+    if pruner_name in ['level', 'slim', 'fpgm', 'l1filter']:
Why keep the optimizer parameter for ActivationMeanRankFilterPruner and ActivationAPoZRankFilterPruner? They do not need the optimizer hook either. Of all the one-shot pruners, it seems only TaylorFOWeightFilterPruner needs the optimizer (to calculate the gradient contribution).
In the current implementation, ActivationMeanRankFilterPruner and ActivationAPoZRankFilterPruner need the optimizer hook. If we remove the optimizer param, the sparsity after pruning will always be 0.
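
To illustrate why (a generic sketch, not NNI's actual implementation): activation-based pruners of this kind collect their statistics from a hook that fires on each optimizer step, so without an optimizer nothing is ever collected and the computed masks leave every weight in place.

```python
import torch

# Illustrative sketch only (not NNI's real ActivationMeanRankFilterPruner /
# ActivationAPoZRankFilterPruner code): a pruner that gathers statistics via
# an optimizer hook reports zero sparsity when no optimizer is supplied.
class HookedActivationPruner:
    def __init__(self, model, optimizer=None):
        self.model = model
        self.stats = []  # one statistics entry per training step

        if optimizer is not None:
            original_step = optimizer.step

            def step_with_collection(*args, **kwargs):
                result = original_step(*args, **kwargs)
                with torch.no_grad():
                    # Placeholder statistic; a real pruner would rank filters
                    # by mean activation or APoZ gathered via forward hooks.
                    self.stats.append(
                        sum(p.abs().mean().item() for p in self.model.parameters())
                    )
                return result

            optimizer.step = step_with_collection  # the "optimizer hook"

    def compute_sparsity(self):
        # No optimizer -> the hook never fires -> no statistics -> no weights
        # are masked, so the measured sparsity after pruning stays at 0.
        return 0.0 if not self.stats else 0.5  # 0.5 stands in for a real ratio
```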
As discussed in the meeting, let's not make this breaking change.
(cherry picked from commit d0a9b10)
The 'optimizer' parameter was removed from the pruners ('level', 'slim', 'fpgm', 'l1', 'l2') in PR #2676. This example should be changed accordingly.
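
For illustration, here is a minimal sketch of how the example's create_pruner helper might look after that removal, assuming prune_config maps pruner names to their classes and config lists as in the diff above (the exact branch list and fallback are assumptions, not the merged code):

```python
def create_pruner(model, pruner_name, optimizer=None):
    pruner_class = prune_config[pruner_name]['pruner_class']
    config_list = prune_config[pruner_name]['config_list']
    # Pruners whose constructors no longer accept an optimizer after PR #2676.
    if pruner_name in ['level', 'slim', 'fpgm', 'l1filter']:
        return pruner_class(model, config_list)
    # The remaining pruners (e.g. the activation-based and Taylor-FO filter
    # pruners) still receive the optimizer so their training-time hooks can run.
    return pruner_class(model, config_list, optimizer)
```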