Multi-Task Pruning for Semantic Segmentation Networks

07/16/2020
by Xinghao Chen, et al.

This paper focuses on channel pruning for semantic segmentation networks. A large body of work compresses and accelerates deep neural networks for the classification task (e.g., ResNet-50 on ImageNet), but these methods cannot be straightforwardly applied to semantic segmentation networks, which involve an implicit multi-task learning problem. To boost segmentation performance, the backbone of a semantic segmentation network is often pre-trained on a large-scale classification dataset (e.g., ImageNet) and then optimized on the target segmentation dataset. To identify the redundancy in segmentation networks, we therefore present a multi-task channel pruning approach, in which the importance of each convolution filter w.r.t. the channel of an arbitrary layer is simultaneously determined by the classification and segmentation tasks. In addition, we develop an alternative scheme for optimizing the importance scores of filters across the entire network. Experimental results on several benchmarks illustrate the superiority of the proposed algorithm over state-of-the-art pruning methods. Notably, we obtain roughly a 2× FLOPs reduction on DeepLabv3 with only about a 1% mIoU drop on the PASCAL VOC 2012 dataset and about a 1.3% mIoU drop on the Cityscapes dataset.
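To make the idea of task-coupled channel importance concrete, the sketch below shows one way such scores could be learned in PyTorch: a per-channel gate on a shared feature map receives gradients from both a classification loss and a segmentation loss. This is a minimal illustration under stated assumptions, not the authors' implementation; the `ChannelGate` module, the toy backbone, and the weighting factor `lam` are all hypothetical.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Per-channel importance scores applied as a soft mask on feature maps."""
    def __init__(self, num_channels):
        super().__init__()
        self.scores = nn.Parameter(torch.ones(num_channels))

    def forward(self, x):
        # Broadcast scores over (N, C, H, W) feature maps.
        return x * self.scores.view(1, -1, 1, 1)

class TinySharedBackbone(nn.Module):
    """Toy backbone shared by a classification head and a segmentation head."""
    def __init__(self, num_cls_classes=10, num_seg_classes=21):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, 3, padding=1)
        self.gate = ChannelGate(16)
        self.cls_head = nn.Linear(16, num_cls_classes)
        self.seg_head = nn.Conv2d(16, num_seg_classes, 1)

    def forward(self, x):
        feat = self.gate(torch.relu(self.conv(x)))
        logits_cls = self.cls_head(feat.mean(dim=(2, 3)))  # global average pooling
        logits_seg = self.seg_head(feat)                    # per-pixel logits
        return logits_cls, logits_seg

def importance_step(model, optimizer, cls_batch, seg_batch, lam=0.5):
    """One update where both tasks drive the channel importance scores.
    `lam` balancing the two losses is an assumption for illustration."""
    cls_x, cls_y = cls_batch
    seg_x, seg_y = seg_batch
    ce = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    logits_cls, _ = model(cls_x)
    _, logits_seg = model(seg_x)
    loss = lam * ce(logits_cls, cls_y) + (1 - lam) * ce(logits_seg, seg_y)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random data:
model = TinySharedBackbone()
opt = torch.optim.SGD(model.parameters(), lr=0.01)
cls_batch = (torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,)))
seg_batch = (torch.randn(4, 3, 32, 32), torch.randint(0, 21, (4, 32, 32)))
importance_step(model, opt, cls_batch, seg_batch)
```

After training, channels whose learned scores have the smallest magnitude would be the natural pruning candidates, with the remaining network fine-tuned on the segmentation dataset.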
