Channel Pruning via Multi-Criteria based on Weight Dependency

11/06/2020
by Yangchun Yan, et al.

Channel pruning has demonstrated its effectiveness in compressing ConvNets. In many prior works, the importance of an output feature map is determined only by its associated filter. However, these methods ignore the small portion of weights in the next layer that disappears when the feature map is removed: because this weight dependency is overlooked, part of the weights is pruned without ever being evaluated. In addition, many pruning methods rely on a single criterion and search for a sweet spot between pruning structure and accuracy in a trial-and-error fashion, which can be time-consuming. To address these issues, we propose CPMC, a channel pruning algorithm via multi-criteria based on weight dependency, which can compress a variety of models efficiently. We measure the importance of a feature map in three aspects: its associated weight value, its computational cost, and its parameter quantity. Exploiting weight dependency, we obtain the importance by assessing both the associated filter and the corresponding partial weights of the next layer. We then apply global normalization to enable cross-layer comparison. Our method compresses various CNN models, including VGGNet, ResNet, and DenseNet, on several image classification datasets. Extensive experiments show that CPMC significantly outperforms prior methods.
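The abstract does not give the exact scoring formulas, so the following PyTorch sketch only illustrates the two key ideas: (1) a channel's weight score covers both its own filter and the dependent input slice of the next layer's weights, and (2) per-criterion scores are normalized globally before channels are compared across layers. The fusion rule, the L1 magnitudes, and the helper names (`channel_importance`, `cpmc_style_mask`) are assumptions for illustration, not the paper's implementation.

```python
import torch


def channel_importance(curr_w, next_w, flops_per_ch, params_per_ch):
    """Raw per-channel scores for one conv layer under weight dependency.

    curr_w : (C_out, C_in, kH, kW)   weights of the layer being pruned
    next_w : (C_next, C_out, kH2, kW2) weights of the following layer;
             pruning output channel k here also deletes next_w[:, k].
    Returns three (C_out,) tensors: weight magnitude, per-channel FLOPs,
    per-channel parameter count (the latter two are constants per layer).
    """
    filter_mag = curr_w.abs().sum(dim=(1, 2, 3))
    # Weight dependency: also count the next-layer slice that disappears
    # together with this output channel.
    dependent_mag = next_w.abs().sum(dim=(0, 2, 3))
    weight_score = filter_mag + dependent_mag
    c_out = curr_w.shape[0]
    return (weight_score,
            torch.full((c_out,), float(flops_per_ch)),
            torch.full((c_out,), float(params_per_ch)))


def cpmc_style_mask(layers, prune_ratio=0.3):
    """Globally normalize each criterion, fuse them, and flag the least
    important channels network-wide.

    layers: list of (weight_score, flops, params) tuples, one per layer.
    Returns one boolean keep-mask per layer (True = keep the channel).
    """
    w_all = torch.cat([l[0] for l in layers])
    f_all = torch.cat([l[1] for l in layers])
    p_all = torch.cat([l[2] for l in layers])

    # Global normalization makes scores comparable across layers.
    def norm(x):
        return x / (x.sum() + 1e-12)

    # Fusion rule is an assumption: magnitude discounted by the resources
    # a channel consumes, so costly, low-magnitude channels go first.
    combined = norm(w_all) / (1.0 + norm(f_all) + norm(p_all))

    k = int(prune_ratio * combined.numel())
    threshold = combined.kthvalue(k).values if k > 0 else -float("inf")

    masks, offset = [], 0
    for w, _, _ in layers:
        n = w.numel()
        masks.append(combined[offset:offset + n] > threshold)
        offset += n
    return masks


# Toy usage: two conv layers, where conv2 consumes conv1's 16 channels.
# The per-channel FLOPs/params values are placeholders.
conv1 = torch.randn(16, 3, 3, 3)
conv2 = torch.randn(32, 16, 3, 3)
scores = [channel_importance(conv1, conv2, flops_per_ch=9e4, params_per_ch=27)]
keep = cpmc_style_mask(scores, prune_ratio=0.25)
```

Because the threshold is taken over the concatenated, globally normalized scores rather than per layer, the pruning ratio of each layer emerges from the comparison instead of being fixed by hand, which is the point of the cross-layer normalization described above.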


