Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression

04/07/2021
by   Xin Ding, et al.

Knowledge distillation (KD) has been actively studied for image classification in deep learning, with the aim of improving a student model's performance using knowledge from a teacher model. However, few efforts have been made to apply KD to image regression with a scalar response, and no existing KD method is applicable to both tasks. Moreover, existing KD methods often require a practitioner to carefully choose or adjust the teacher and student architectures, which limits their scalability in practice. Furthermore, although KD is usually conducted in scenarios with limited labeled data, very few techniques have been developed to alleviate such data insufficiency. To address these problems in a unified manner, this paper proposes a KD framework based on conditional generative adversarial networks (cGANs), termed cGAN-KD. Fundamentally different from existing KD methods, cGAN-KD distills and transfers knowledge from a teacher model to a student model via cGAN-generated samples. This mechanism makes cGAN-KD suitable for both classification and regression tasks, compatible with other KD methods, and insensitive to the teacher and student architectures. Benefiting from recent advances in cGAN methodology and our specially designed subsampling and filtering procedures, cGAN-KD also performs well when labeled data are scarce. We further derive an error bound for a student model trained within the cGAN-KD framework, which theoretically explains why cGAN-KD is effective and guides its implementation in practice. Extensive experiments on CIFAR-10 and Tiny-ImageNet show that incorporating state-of-the-art KD methods into the cGAN-KD framework yields a new state of the art. Experiments on RC-49 and UTKFace further demonstrate the effectiveness of cGAN-KD on image regression tasks, where existing KD methods are inapplicable.
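The abstract describes cGAN-KD only at a high level. The minimal sketch below illustrates one plausible reading of the pipeline for the classification case: draw conditional samples from a trained cGAN, filter them by the teacher's confidence (a stand-in for the paper's subsampling and filtering procedures), label the survivors with the teacher, and train the student on real and generated data together. The function names, the generator interface, and the confidence-threshold filtering rule are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a cGAN-KD-style pipeline (classification case).
# `generator`, `teacher`, `student`, and `filter_threshold` are assumed
# interfaces for illustration only.
import torch
import torch.nn.functional as F


def generate_and_filter(generator, teacher, labels, z_dim=128,
                        filter_threshold=0.9, device="cpu"):
    """Sample images from a conditional generator, keep only those the
    teacher classifies as the conditioning label with high confidence,
    and return the survivors with teacher-assigned soft labels."""
    generator.eval()
    teacher.eval()
    with torch.no_grad():
        z = torch.randn(labels.size(0), z_dim, device=device)
        fake = generator(z, labels)                  # cGAN-generated samples
        probs = F.softmax(teacher(fake), dim=1)      # teacher predictions
        conf, pred = probs.max(dim=1)
        keep = (pred == labels) & (conf >= filter_threshold)  # filtering step
    return fake[keep], probs[keep]                   # images + soft labels


def train_student_step(student, optimizer, real_x, real_y, fake_x, fake_soft_y):
    """One optimization step on a mix of real labeled data (hard labels)
    and filtered cGAN samples carrying the teacher's soft labels."""
    student.train()
    optimizer.zero_grad()
    loss_real = F.cross_entropy(student(real_x), real_y)
    loss_fake = F.kl_div(F.log_softmax(student(fake_x), dim=1),
                         fake_soft_y, reduction="batchmean")
    loss = loss_real + loss_fake
    loss.backward()
    optimizer.step()
    return loss.item()
```

For the regression setting the abstract mentions, the same idea would presumably replace the cross-entropy and KL terms with a scalar-response loss (e.g., L1 or L2) against teacher-predicted labels, but the exact losses and filtering criteria are not specified in the abstract.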
