Model Compression with Generative Adversarial Networks

12/05/2018
by Ruishan Liu, et al.

More accurate machine learning models often demand more computation and memory at test time, making them difficult to deploy on CPU- or memory-constrained devices. Model compression (also known as distillation) alleviates this burden by training a less expensive student model to mimic the expensive teacher model while maintaining most of the original accuracy. However, when fresh data is unavailable for the compression task, the teacher's training data is typically reused, leading to suboptimal compression. In this work, we propose to augment the compression dataset with synthetic data from a generative adversarial network (GAN) designed to approximate the training data distribution. Our GAN-assisted model compression (GAN-MC) significantly improves student accuracy for expensive models such as deep neural networks and large random forests on both image and tabular datasets. Building on these results, we propose a comprehensive metric---the Compression Score---to evaluate the quality of synthetic datasets based on their induced model compression performance. The Compression Score captures both data diversity and discriminability, and we illustrate its benefits over the popular Inception Score in the context of image classification.
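For concreteness, the minimal sketch below illustrates how GAN-assisted distillation of the kind described above could look for a tabular classifier. It assumes a scikit-learn teacher and student and a callable sample_synthetic standing in for a pretrained GAN generator; the paper's actual GAN architecture and distillation objective are not reproduced here, and hard teacher labels are used as a simplification.

# Sketch of GAN-assisted model compression (GAN-MC), under the assumptions above:
# a scikit-learn teacher/student pair and a `sample_synthetic(n)` callable that
# stands in for sampling from a pretrained GAN generator (hypothetical interface).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

def gan_mc_compress(teacher, student, X_train, sample_synthetic, n_synthetic=10_000):
    """Fit `student` to mimic `teacher` on real plus GAN-generated inputs."""
    X_fake = sample_synthetic(n_synthetic)       # synthetic inputs from the generator
    X_aug = np.vstack([X_train, X_fake])         # augmented compression dataset
    y_soft = teacher.predict_proba(X_aug)        # teacher's predicted class probabilities
    # Simplification: train the student on the teacher's argmax labels
    # (soft-label regression is another option if the student supports it).
    student.fit(X_aug, y_soft.argmax(axis=1))
    return student

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))
    y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)
    teacher = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
    # Stand-in for a GAN sampler: perturbed copies of real rows, for illustration only.
    sample_synthetic = lambda n: X[rng.integers(len(X), size=n)] + 0.05 * rng.normal(size=(n, 10))
    student = gan_mc_compress(teacher, DecisionTreeClassifier(max_depth=8, random_state=0),
                              X, sample_synthetic)
    print("teacher acc:", teacher.score(X, y), "student acc:", student.score(X, y))

In the same spirit, the Compression Score described in the abstract would rank a synthetic dataset by the student accuracy that such a compression procedure achieves when that dataset is used as the augmentation source.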
