Learning Sparse Ternary Neural Networks with Entropy-Constrained Trained Ternarization (EC2T)

04/02/2020
by Arturo Marban, et al.

Deep neural networks (DNNs) have shown remarkable success in a variety of machine learning applications. The capacity of these models (i.e., their number of parameters) endows them with expressive power and allows them to reach the desired performance. In recent years, there has been increasing interest in deploying DNNs on resource-constrained devices (e.g., mobile devices) with limited energy, memory, and computational budgets. To address this problem, we propose Entropy-Constrained Trained Ternarization (EC2T), a general framework for creating sparse and ternary neural networks that are efficient in terms of storage (e.g., at most two binary masks and two full-precision values are required to store a weight matrix) and computation (e.g., MAC operations are reduced to a few accumulations plus two multiplications). This approach consists of two steps. First, a super-network is created by scaling the dimensions of a pre-trained model (i.e., its width and depth). Subsequently, this super-network is simultaneously pruned (using an entropy constraint) and quantized (that is, ternary values are assigned layer-wise) in a training process, resulting in a sparse and ternary network representation. We validate the proposed approach on the CIFAR-10, CIFAR-100, and ImageNet datasets, showing its effectiveness in image classification tasks.
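The storage and compute claims above can be made concrete with a short sketch. The NumPy code below is a minimal illustration, not the authors' implementation: the class name `SparseTernaryLinear`, the toy thresholds, and the centroid values `w_pos`/`w_neg` are assumptions introduced here. It shows how a ternary weight matrix W ≈ w_pos·M_pos − w_neg·M_neg can be stored as two binary masks plus two full-precision scalars, and why a matrix product then reduces to accumulations followed by two scalar multiplications.

```python
import numpy as np

class SparseTernaryLinear:
    """Hypothetical sparse ternary layer whose weights take only three
    values, {+w_pos, 0, -w_neg}. Storage is two binary masks plus two
    full-precision centroids, matching the cost quoted in the abstract."""

    def __init__(self, mask_pos: np.ndarray, mask_neg: np.ndarray,
                 w_pos: float, w_neg: float):
        # The masks must be disjoint: a weight is positive, negative, or pruned.
        assert not np.any(mask_pos * mask_neg), "masks must be disjoint"
        self.mask_pos = mask_pos.astype(np.float32)  # 1 where weight = +w_pos
        self.mask_neg = mask_neg.astype(np.float32)  # 1 where weight = -w_neg
        self.w_pos, self.w_neg = w_pos, w_neg        # full-precision centroids

    def forward(self, x: np.ndarray) -> np.ndarray:
        # The masked products involve only 0/1 entries, so they are pure
        # accumulations; the two multiplications are the scalar rescalings,
        # applied once per output rather than once per weight.
        return self.w_pos * (x @ self.mask_pos) - self.w_neg * (x @ self.mask_neg)

# Illustrative usage with hypothetical shapes, thresholds, and centroids.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 32))
mask_pos = w > 0.8    # toy sign/magnitude assignment: large positive weights
mask_neg = w < -0.8   # large negative weights; everything else is pruned to 0
layer = SparseTernaryLinear(mask_pos, mask_neg, w_pos=0.7, w_neg=0.6)
y = layer.forward(rng.standard_normal((8, 64)))
print(y.shape)  # (8, 32)
```

Note that this sketch covers only inference with a fixed assignment; in EC2T itself, the masks and centroid values are learned jointly during the entropy-constrained training process described above.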


