Adaptive Consistency Regularization for Semi-Supervised Transfer Learning

03/03/2021
by Abulikemu Abuduweili, et al.

While recent studies on semi-supervised learning have shown remarkable progress in leveraging both labeled and unlabeled data, most of them presume a basic setting in which the model is randomly initialized. In this work, we consider semi-supervised learning and transfer learning jointly, leading to a more practical and competitive paradigm that can utilize powerful pre-trained models from a source domain as well as labeled and unlabeled data in the target domain. To better exploit the value of both pre-trained weights and unlabeled target examples, we introduce adaptive consistency regularization, which consists of two complementary components: Adaptive Knowledge Consistency (AKC) between the source and target models on selected examples, and Adaptive Representation Consistency (ARC) on the target model between labeled and unlabeled examples. Examples involved in the consistency regularization are adaptively selected according to their potential contributions to the target task. We conduct extensive experiments on several popular benchmarks, including CUB-200-2011, MIT Indoor-67, and MURA, by fine-tuning an ImageNet pre-trained ResNet-50 model. Results show that our proposed adaptive consistency regularization outperforms state-of-the-art semi-supervised learning techniques such as Pseudo Label, Mean Teacher, and MixMatch. Moreover, our algorithm is orthogonal to existing methods and can therefore gain additional improvements on top of MixMatch and FixMatch. Our code is available at https://github.com/SHI-Labs/Semi-Supervised-Transfer-Learning.
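To make the two regularizers concrete, the sketch below illustrates one plausible reading of the abstract in NumPy: AKC as a divergence between the source (pre-trained) model's predictions and the target model's predictions, restricted to examples selected by a confidence criterion, and ARC as a match between feature statistics of labeled and unlabeled target batches. All function names, the confidence threshold, and the specific distance choices here are assumptions for illustration, not the authors' actual implementation (see the linked repository for that).

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax over the class axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def akc_loss(source_logits, target_logits, conf_threshold=0.8):
    """Adaptive Knowledge Consistency (sketch): KL divergence between the
    frozen source model's predictions and the target model's predictions,
    averaged only over examples the source model is confident about
    (a simple stand-in for the paper's adaptive example selection)."""
    p = softmax(source_logits)                 # source (teacher) distribution
    q = softmax(target_logits)                 # target (student) distribution
    mask = p.max(axis=1) >= conf_threshold     # adaptive example selection
    if not mask.any():
        return 0.0
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=1)
    return float(kl[mask].mean())

def arc_loss(feat_labeled, feat_unlabeled):
    """Adaptive Representation Consistency (sketch): penalize the squared
    distance between mean feature vectors of labeled and unlabeled
    target-domain batches, encouraging aligned representations."""
    diff = feat_labeled.mean(axis=0) - feat_unlabeled.mean(axis=0)
    return float(np.sum(diff ** 2))
```

In training, both terms would be added to the supervised cross-entropy loss with their own weights; the key idea the sketch captures is that consistency is enforced selectively rather than uniformly over all examples.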

