Large-Scale Generative Data-Free Distillation

12/10/2020
by Liangchen Luo, et al.

Knowledge distillation is one of the most popular and effective techniques for knowledge transfer, model compression, and semi-supervised learning. Most existing distillation approaches require access to original or augmented training samples, which can be problematic in practice due to privacy, proprietary, and availability concerns. Recent work has put forward methods to tackle this problem, but they are either highly time-consuming or unable to scale to large datasets. To this end, we propose a new method to train a generative image model by leveraging the intrinsic statistics of the trained teacher network's normalization layers. This enables us to build an ensemble of generators, without any training data, that can efficiently produce substitute inputs for subsequent distillation. The proposed method pushes forward the data-free distillation performance on CIFAR-10 and CIFAR-100 to 95.02% and 77.02%, respectively, which, to the best of our knowledge, has never been achieved using generative models in a data-free setting.
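
To give a rough sense of the statistics-matching idea the abstract describes, here is a minimal PyTorch sketch, not the authors' code: assuming the teacher is a standard convolutional network with BatchNorm2d layers, a batch of synthetic images can be scored by how closely its per-channel activation statistics match the running mean and variance the teacher accumulated during training, and a generator can be trained to minimize that gap. The function name, generator, and latent dimension below are illustrative, and the paper's full objective may include additional terms.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def bn_statistics_loss(teacher: nn.Module, images: torch.Tensor) -> torch.Tensor:
        """Penalize the gap between the synthetic batch's per-channel
        statistics and the running statistics stored in the teacher's
        BatchNorm layers. Assumes teacher is in eval() mode."""
        losses = []
        hooks = []

        def hook(module, inputs, output):
            x = inputs[0]
            mean = x.mean(dim=(0, 2, 3))                # per-channel batch mean
            var = x.var(dim=(0, 2, 3), unbiased=False)  # per-channel batch variance
            losses.append(F.mse_loss(mean, module.running_mean)
                          + F.mse_loss(var, module.running_var))

        for m in teacher.modules():
            if isinstance(m, nn.BatchNorm2d):
                hooks.append(m.register_forward_hook(hook))

        teacher(images)  # forward pass fires the hooks above
        for h in hooks:
            h.remove()
        return torch.stack(losses).sum()

    # Illustrative generator update (generator and latent size are hypothetical):
    # teacher.eval()                         # freeze BN running stats
    # z = torch.randn(64, 100)
    # fake = generator(z)
    # loss = bn_statistics_loss(teacher, fake)
    # loss.backward()                        # gradients flow into the generator

Once such generators are trained, their samples serve as the substitute inputs for an otherwise standard teacher-student distillation step.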


Related research

04/12/2021 · Dual Discriminator Adversarial Distillation for Data-free Model Compression
Knowledge distillation has been widely used to produce portable and effi...

12/31/2021 · Conditional Generative Data-Free Knowledge Distillation based on Attention Transfer
Knowledge distillation has made remarkable achievements in model compres...

03/08/2023 · DiM: Distilling Dataset into Generative Model
Dataset distillation reduces the network training cost by synthesizing s...

03/21/2023 · Model Robustness Meets Data Privacy: Adversarial Robustness Distillation without Original Data
Large-scale deep learning models have achieved great performance based o...

05/17/2019 · Dream Distillation: A Data-Independent Model Compression Framework
Model compression is eminently suited for deploying deep learning on IoT...

12/12/2021 · Up to 100x Faster Data-free Knowledge Distillation
Data-free knowledge distillation (DFKD) has recently been attracting inc...

03/03/2023 · Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation
Staining is critical to cell imaging and medical diagnosis, which is exp...
