Self-grouping Convolutional Neural Networks

09/29/2020
by Qingbei Guo, et al.

Although group convolution operators are increasingly used in deep convolutional neural networks to improve computational efficiency and to reduce the number of parameters, most existing methods construct their group convolution architectures by a predefined partitioning of the filters of each convolutional layer into multiple regular filter groups of equal size, in a data-independent manner, which prevents a full exploitation of their potential. To tackle this issue, we propose a novel method of designing self-grouping convolutional neural networks, called SG-CNN, in which the filters of each convolutional layer group themselves based on the similarity of their importance vectors. Concretely, for each filter, we first evaluate the importance values of its input channels to form its importance vector, and then group these vectors by clustering. Using the resulting data-dependent centroids, we prune the less important connections, which implicitly minimizes the accuracy loss of the pruning, thus yielding a set of diverse group convolution filters. Subsequently, we develop two fine-tuning schemes, i.e., (1) combined local and global fine-tuning and (2) global-only fine-tuning, which experimentally deliver comparable results, to recover the recognition capacity of the pruned network. Comprehensive experiments on the CIFAR-10/100 and ImageNet datasets demonstrate that our self-grouping convolution method adapts to various state-of-the-art CNN architectures, such as ResNet and DenseNet, and delivers superior performance in terms of compression ratio, speedup and recognition accuracy. We further demonstrate the ability of SG-CNN to generalise by transfer learning, including domain adaptation and object detection, showing competitive results. Our source code is available at https://github.com/QingbeiGuo/SG-CNN.git.
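The pipeline the abstract describes — per-filter importance vectors over input channels, clustering those vectors, and pruning connections via the resulting centroids — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the L1-norm importance measure, the plain k-means loop, the `keep_ratio` threshold, and all function and parameter names here are assumptions for exposition only.

```python
# Sketch of the self-grouping idea: build an importance vector per filter
# (one importance value per input channel), cluster the vectors, and derive
# a data-dependent channel mask per group from the cluster centroids.
# NOTE: importance metric, clustering, and pruning rule are illustrative
# assumptions, not SG-CNN's actual code.
import numpy as np

def self_group_filters(weights, num_groups=2, keep_ratio=0.5, seed=0):
    """weights: (out_channels, in_channels, kH, kW) convolution kernel."""
    out_c, in_c = weights.shape[:2]

    # Importance of each input channel for each filter: L1 norm of the
    # corresponding kernel slice (a common, simple importance proxy).
    importance = np.abs(weights).sum(axis=(2, 3))      # (out_c, in_c)

    # Plain k-means over the importance vectors (stand-in for the
    # paper's clustering step).
    rng = np.random.default_rng(seed)
    centroids = importance[rng.choice(out_c, num_groups, replace=False)]
    for _ in range(20):
        dists = ((importance[:, None, :] - centroids[None]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)              # group of each filter
        for g in range(num_groups):
            if np.any(labels == g):
                centroids[g] = importance[labels == g].mean(axis=0)

    # Prune per group: keep only the input channels whose centroid
    # importance is largest, yielding diverse (irregular) filter groups.
    keep = max(1, int(in_c * keep_ratio))
    masks = np.zeros((num_groups, in_c), dtype=bool)
    for g in range(num_groups):
        masks[g, np.argsort(centroids[g])[-keep:]] = True
    return labels, masks
```

A fine-tuning pass (local, global, or both, as in the two schemes above) would then recover accuracy on the masked network; that step is omitted here.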


