[Model Compression] Update api of iterative pruners #3507
Changes to the compression API reference:

@@ -34,7 +34,7 @@ Weight Masker

 .. autoclass:: nni.algorithms.compression.pytorch.pruning.weight_masker.WeightMasker
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.structured_pruning.StructuredWeightMasker
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.structured_pruning_masker.StructuredWeightMasker
    :members:
@@ -43,40 +43,40 @@ Pruners

 .. autoclass:: nni.algorithms.compression.pytorch.pruning.sensitivity_pruner.SensitivityPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.OneshotPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot_pruner.OneshotPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.LevelPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot_pruner.LevelPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.SlimPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot_pruner.L1FilterPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.L1FilterPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot_pruner.L2FilterPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.L2FilterPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot_pruner.FPGMPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.FPGMPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.IterativePruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.TaylorFOWeightFilterPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.SlimPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.ActivationAPoZRankFilterPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.TaylorFOWeightFilterPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.one_shot.ActivationMeanRankFilterPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.ActivationAPoZRankFilterPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.lottery_ticket.LotteryTicketPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.ActivationMeanRankFilterPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.agp.AGPPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.AGPPruner
    :members:

-.. autoclass:: nni.algorithms.compression.pytorch.pruning.admm_pruner.ADMMPruner
+.. autoclass:: nni.algorithms.compression.pytorch.pruning.iterative_pruner.ADMMPruner
    :members:

 .. autoclass:: nni.algorithms.compression.pytorch.pruning.auto_compress_pruner.AutoCompressPruner
@@ -88,6 +88,9 @@ Pruners

 .. autoclass:: nni.algorithms.compression.pytorch.pruning.simulated_annealing_pruner.SimulatedAnnealingPruner
    :members:

+.. autoclass:: nni.algorithms.compression.pytorch.pruning.lottery_ticket.LotteryTicketPruner
+   :members:
+
 Quantizers
 ^^^^^^^^^^

Review discussion on this hunk:

Comment: Why are some pruners in neither the one-shot module nor the iterative one? I think it's a little bit confusing.
Reply: Actually, some pruners (e.g., …
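For downstream code that imports these classes directly, the hunks above imply a set of old-module to new-module moves. The sketch below only summarizes what the diff itself shows (module paths are copied verbatim from the hunks; the dict and helper are illustrative, not NNI API):

```python
# Old module -> new module for each moved pruner class, as implied by the
# diff above. Paths are relative to nni.algorithms.compression.pytorch.
PRUNING_MODULE_MOVES = {
    "StructuredWeightMasker":         ("pruning.structured_pruning", "pruning.structured_pruning_masker"),
    "OneshotPruner":                  ("pruning.one_shot", "pruning.one_shot_pruner"),
    "LevelPruner":                    ("pruning.one_shot", "pruning.one_shot_pruner"),
    "L1FilterPruner":                 ("pruning.one_shot", "pruning.one_shot_pruner"),
    "L2FilterPruner":                 ("pruning.one_shot", "pruning.one_shot_pruner"),
    "FPGMPruner":                     ("pruning.one_shot", "pruning.one_shot_pruner"),
    "SlimPruner":                     ("pruning.one_shot", "pruning.iterative_pruner"),
    "TaylorFOWeightFilterPruner":     ("pruning.one_shot", "pruning.iterative_pruner"),
    "ActivationAPoZRankFilterPruner": ("pruning.one_shot", "pruning.iterative_pruner"),
    "ActivationMeanRankFilterPruner": ("pruning.one_shot", "pruning.iterative_pruner"),
    "AGPPruner":                      ("pruning.agp", "pruning.iterative_pruner"),
    "ADMMPruner":                     ("pruning.admm_pruner", "pruning.iterative_pruner"),
}

def new_module(cls_name: str) -> str:
    """Return the post-PR module for a pruner class (KeyError if not moved)."""
    return PRUNING_MODULE_MOVES[cls_name][1]
```

Note that `SlimPruner` and the activation/Taylor pruners change category (one-shot to iterative), not just module name, which is the substance of this PR.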
Changes to the pruning overview doc ("Supported Pruning Algorithms on NNI"):

@@ -1,15 +1,11 @@

 Supported Pruning Algorithms on NNI
 ===================================

-We provide several pruning algorithms that support fine-grained weight pruning and structural filter pruning. **Fine-grained Pruning** generally results in unstructured models, which need specialized hardware or software to speed up the sparse network. **Filter Pruning** achieves acceleration by removing the entire filter. Some pruning algorithms use one-shot method that prune weights at once based on an importance metric. Other pruning algorithms control the **pruning schedule** that prune weights during optimization, including some automatic pruning algorithms.
+We provide several pruning algorithms that support fine-grained weight pruning and structural filter pruning. **Fine-grained Pruning** generally results in unstructured models, which need specialized hardware or software to speed up the sparse network. **Filter Pruning** achieves acceleration by removing entire filters. Some pruning algorithms use a one-shot method that prunes weights at once based on an importance metric (the model then needs fine-tuning to compensate for the loss of accuracy). Other pruning algorithms prune weights **iteratively** during optimization, controlling the pruning schedule; these include some automatic pruning algorithms.

Review discussion on this paragraph:

Comment: "Filter Pruning" -> "One-shot Pruning"?
Reply: Filter Pruning and Fine-grained Pruning are the alternative methods from the perspective of pruning granularity.
Comment: Got it, so we have two orthogonal groupings: (Filter Pruning vs. Fine-grained Pruning) and (Iterative Pruning vs. One-shot Pruning).
Comment: What is the difference between "filter pruning" and "channel pruning"? Is "channel pruning" also a type of "filter pruning"?
Reply: Their selection rules are different: one is based on the weight of the filter (output channel), the other on the weight of the input channel. Filter pruning prunes the output channels of conv layers, and channel pruning prunes the input channels of conv layers. They have the same effect after propagation (pruning an input channel is equivalent to pruning a filter of the previous conv layer).
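The equivalence described in that reply can be sketched with plain weight-shape arithmetic (illustrative helper names, not NNI API): removing a filter (output channel) of one conv layer forces, after propagation, the removal of the matching input channel of the next conv layer.

```python
# Conv weight shapes are (out_channels, in_channels, kernel_h, kernel_w).

def prune_filters(conv_shape, n):
    """Filter pruning: drop n output channels (filters)."""
    out_ch, in_ch, kh, kw = conv_shape
    return (out_ch - n, in_ch, kh, kw)

def prune_input_channels(conv_shape, n):
    """Channel pruning: drop n input channels."""
    out_ch, in_ch, kh, kw = conv_shape
    return (out_ch, in_ch - n, kh, kw)

conv1 = (64, 3, 3, 3)     # produces 64 feature maps
conv2 = (128, 64, 3, 3)   # consumes those 64 maps

# Filter-pruning 16 filters of conv1 removes 16 of its output maps, so
# conv2 must drop the corresponding 16 input channels to stay consistent:
conv1_pruned = prune_filters(conv1, 16)        # -> (48, 3, 3, 3)
conv2_after = prune_input_channels(conv2, 16)  # -> (128, 48, 3, 3)
assert conv1_pruned[0] == conv2_after[1]
```

This is why the reply says channel-pruning conv2's inputs is equivalent to filter-pruning conv1: both operations delete the same 16 feature maps from the network.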
-**Fine-grained Pruning**
-
-* `Level Pruner <#level-pruner>`__
-
-**Filter Pruning**
-
+**One-shot Pruning**
+
+* `Level Pruner <#level-pruner>`__ (fine-grained pruning)
 * `Slim Pruner <#slim-pruner>`__
 * `FPGM Pruner <#fpgm-pruner>`__
 * `L1Filter Pruner <#l1filter-pruner>`__

@@ -18,18 +14,17 @@ We provide several pruning algorithms that support fine-grained weight pruning a

 * `Activation Mean Rank Filter Pruner <#activationmeanrankfilter-pruner>`__
 * `Taylor FO On Weight Pruner <#taylorfoweightfilter-pruner>`__

-**Pruning Schedule**
+**Iteratively Pruning**

 * `AGP Pruner <#agp-pruner>`__
 * `NetAdapt Pruner <#netadapt-pruner>`__
 * `SimulatedAnnealing Pruner <#simulatedannealing-pruner>`__
 * `AutoCompress Pruner <#autocompress-pruner>`__
 * `AMC Pruner <#amc-pruner>`__
 * `Sensitivity Pruner <#sensitivity-pruner>`__
+* `ADMM Pruner <#admm-pruner>`__

 **Others**

-* `ADMM Pruner <#admm-pruner>`__
 * `Lottery Ticket Hypothesis <#lottery-ticket-hypothesis>`__

 Level Pruner
@@ -382,11 +377,7 @@ PyTorch code

    from nni.algorithms.compression.pytorch.pruning import AGPPruner
    config_list = [{
-       'initial_sparsity': 0,
-       'final_sparsity': 0.8,
-       'start_epoch': 0,
-       'end_epoch': 10,
-       'frequency': 1,
+       'sparsity': 0.8,
        'op_types': ['default']
    }]
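The `config_list` change above can be stated explicitly as a key-level delta (a sketch using only the keys shown in the hunk; after this PR the removed schedule keys are presumably managed by the pruner itself rather than per-layer config):

```python
# Per-layer AGP config before and after this PR, as shown in the diff.
old_config_list = [{
    'initial_sparsity': 0,
    'final_sparsity': 0.8,
    'start_epoch': 0,
    'end_epoch': 10,
    'frequency': 1,
    'op_types': ['default'],
}]

new_config_list = [{
    'sparsity': 0.8,
    'op_types': ['default'],
}]

# Schedule keys removed from the per-layer config by the update:
removed_keys = set(old_config_list[0]) - set(new_config_list[0])
assert removed_keys == {'initial_sparsity', 'final_sparsity',
                        'start_epoch', 'end_epoch', 'frequency'}
```

In other words, the per-layer config now states only the target sparsity; the iteration schedule moves out of `config_list`.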
New file (ignore patterns for data and checkpoint artifacts):

@@ -0,0 +1,6 @@
+.pth
+.tar.gz
+data/
+MNIST/
+cifar-10-batches-py/
+experiment_data/
Review discussion:

Comment: Why is the AutoCompressPruner not classified into iterative_pruner?
Reply: We can put AutoCompressPruner into iterative_pruner in the future.