[Model Compression] Update api of iterative pruners #3507
Conversation
@@ -1,14 +1,13 @@
Supported Pruning Algorithms on NNI
===================================

- We provide several pruning algorithms that support fine-grained weight pruning and structural filter pruning. **Fine-grained Pruning** generally results in unstructured models, which need specialized hardware or software to speed up the sparse network. **Filter Pruning** achieves acceleration by removing entire filters. Some pruning algorithms use a one-shot method that prunes weights at once based on an importance metric. Other pruning algorithms control the **pruning schedule**, pruning weights during optimization; these include some automatic pruning algorithms.
+ We provide several pruning algorithms that support fine-grained weight pruning and structural filter pruning. **Fine-grained Pruning** generally results in unstructured models, which need specialized hardware or software to speed up the sparse network. **Filter Pruning** achieves acceleration by removing entire filters. Some pruning algorithms use a one-shot method that prunes weights at once based on an importance metric (the model then needs fine-tuning to compensate for the accuracy loss). Other pruning algorithms **iteratively** prune weights during optimization, controlling the pruning schedule; these include some automatic pruning algorithms.
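The one-shot vs. iterative distinction above can be sketched as a sparsity schedule. Below is a minimal, hypothetical illustration (not NNI's actual implementation) of an AGP-style polynomial schedule, in which a one-shot pruner jumps straight to the target sparsity while an iterative pruner ramps sparsity up over several steps:

```python
def agp_sparsity(step, total_steps, initial_sparsity=0.0, final_sparsity=0.8):
    """Polynomial sparsity schedule in the style of AGP (Zhu & Gupta).

    An iterative pruner raises sparsity gradually so the network can
    recover between pruning steps; a one-shot pruner is the degenerate
    case total_steps == 1.
    """
    frac = min(step / total_steps, 1.0)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - frac) ** 3

# One-shot: reach the final sparsity in a single step, then fine-tune.
one_shot = agp_sparsity(1, 1)  # 0.8 immediately

# Iterative: sparsity ramps from 0.0 to 0.8 across 10 pruning steps.
ramp = [round(agp_sparsity(t, 10), 3) for t in range(11)]
```

The cubic term front-loads pruning while the weights are still redundant and slows it near the end, which is the usual motivation for this family of schedules.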
Filter Pruning -> One-shot Pruning
Filter Pruning and Fine-grained Pruning represent the alternative methods from the perspective of pruning granularity.
Got it, so we have two orthogonal categories: by granularity (Filter Pruning vs. Fine-grained Pruning) and by schedule (Iterative Pruning vs. One-shot Pruning).
what is the difference between "filter pruning" and "channel pruning"? "channel pruning" is also a type of "filter pruning"?
> what is the difference between "filter pruning" and "channel pruning"? "channel pruning" is also a type of "filter pruning"?

Their selection rules are different: one is based on the weights of a filter (an output channel), the other is based on the weights of an input channel. Filter pruning prunes the output channels of conv layers, while channel pruning prunes the input channels. They have the same effect after propagation (pruning an input channel is equivalent to pruning a filter of the previous conv layer).
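The equivalence described above can be sketched with weight masks. This is a hypothetical illustration, assuming conv weights laid out as `[out_channels, in_channels, kH, kW]` (the PyTorch convention); the array names are made up for the example:

```python
import numpy as np

# conv1: 4 output channels, 3 input channels, 3x3 kernels.
# conv2: 5 output channels, 4 input channels (conv1's outputs feed conv2).
conv1_w = np.random.rand(4, 3, 3, 3)
conv2_w = np.random.rand(5, 4, 3, 3)

# Filter pruning: zero out output channel 2 of conv1 (a whole filter).
conv1_w[2, :, :, :] = 0.0

# After propagation, conv1's channel-2 activation is all zeros, so
# conv2's input channel 2 contributes nothing. Channel pruning of
# conv2 (zeroing its input channel 2) therefore has the same effect.
conv2_w[:, 2, :, :] = 0.0

assert not conv1_w[2].any() and not conv2_w[:, 2].any()
```

In other words, a filter of one layer and the corresponding input channel of the next layer describe the same connection, just seen from opposite ends.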
Need follow-up:

This PR reconstructs the category of pruners and improves the examples for model compression.