Recurrent Parameter Generators

07/15/2021
by Jiayun Wang, et al.

We present a generic method for recurrently using the same parameters across many different convolution layers to build a deep network. Specifically, for a network, we create a recurrent parameter generator (RPG), from which the parameters of each convolution layer are generated. Though using recurrent models to build a deep convolutional neural network (CNN) is not entirely new, our method achieves significant performance gains over existing works. We demonstrate how to build a one-layer neural network that achieves performance similar to traditional CNN models on various applications and datasets. Such a method allows us to build an arbitrarily complex neural network with any number of parameters. For example, we build a ResNet34 with model parameters reduced by more than 400 times, which still achieves 41.6% ImageNet top-1 accuracy. Furthermore, we demonstrate that the RPG can be applied at different scales, such as layers, blocks, or even sub-networks. Specifically, we use the RPG to build a ResNet18 network with a number of weights equivalent to that of one convolutional layer of a conventional ResNet, and show this model can achieve 67.2% ImageNet top-1 accuracy. The proposed method can be viewed as an inverse approach to model compression. Rather than removing unused parameters from a large model, it aims to squeeze more information into a small number of parameters. Extensive experimental results are provided to demonstrate the power of the proposed recurrent parameter generator.
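To make the idea concrete, below is a minimal PyTorch sketch of how a recurrent parameter generator could work, assuming each convolution draws its kernel from one shared trainable parameter bank through a fixed random index map and sign flip. The class names, the bank size, and the index/sign scheme are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the RPG idea: all convolutions share one parameter bank.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentParamGenerator(nn.Module):
    def __init__(self, bank_size):
        super().__init__()
        # Single trainable parameter bank reused by every generated layer.
        self.bank = nn.Parameter(torch.randn(bank_size) * 0.01)

    def generate(self, shape, seed):
        # Deterministic, non-trainable index map and sign flip per layer, so
        # layers reuse the same parameters without being identical copies.
        n = math.prod(shape)
        g = torch.Generator().manual_seed(seed)
        idx = torch.randint(0, self.bank.numel(), (n,), generator=g)
        sign = torch.randint(0, 2, (n,), generator=g).float() * 2 - 1
        return (self.bank[idx] * sign).view(*shape)

class RPGConv2d(nn.Module):
    def __init__(self, gen, in_ch, out_ch, k, seed, stride=1, padding=1):
        super().__init__()
        self.gen, self.seed = gen, seed
        self.shape = (out_ch, in_ch, k, k)
        self.stride, self.padding = stride, padding

    def forward(self, x):
        # The kernel is generated on the fly; gradients flow back to the bank.
        w = self.gen.generate(self.shape, self.seed)
        return F.conv2d(x, w, stride=self.stride, padding=self.padding)

# Usage: two convolutions whose weights come from the same small bank, so the
# trainable parameter count stays at bank_size regardless of network depth.
gen = RecurrentParamGenerator(bank_size=10_000)
net = nn.Sequential(RPGConv2d(gen, 3, 16, 3, seed=0), nn.ReLU(),
                    RPGConv2d(gen, 16, 32, 3, seed=1))
out = net(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 32, 32, 32])

Under this sketch, adding more layers reuses the same bank, which mirrors the paper's claim that a deep network can be built with the weight budget of a single layer.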

