Efficient Conditional Pre-training for Transfer Learning

11/20/2020
by Shuvam Chakraborty, et al.

Almost all state-of-the-art neural networks for computer vision tasks are trained by (1) pre-training on a large-scale dataset and (2) fine-tuning on the target dataset. This strategy reduces dependence on the target dataset and improves both convergence rate and generalization on the target task. Although pre-training on large-scale datasets is very useful, its foremost disadvantage is high training cost. To address this, we propose efficient target-dataset-conditioned filtering methods to remove less relevant samples from the pre-training dataset. Unlike prior work, we focus on efficiency, adaptability, and flexibility in addition to performance. Additionally, we find that lowering image resolution in the pre-training step offers an excellent trade-off between cost and performance. We validate our techniques by pre-training on ImageNet in both the unsupervised and supervised settings and fine-tuning on a diverse collection of target datasets and tasks. Our proposed methods drastically reduce pre-training cost while providing strong performance boosts.
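The abstract does not spell out the filtering criterion, but a minimal sketch of one common target-conditioned scheme conveys the idea: embed both datasets with some feature extractor, score each pre-training sample by cosine similarity to the mean target-dataset feature, and keep only the most relevant fraction. The function name, the centroid-based score, and the `keep_fraction` parameter are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def filter_pretraining_set(pretrain_feats, target_feats, keep_fraction=0.5):
    """Rank pre-training samples by cosine similarity to the mean
    target-dataset feature and keep the top `keep_fraction` of them.
    (Hypothetical helper; the paper's exact criterion may differ.)"""
    centroid = target_feats.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    norms = np.linalg.norm(pretrain_feats, axis=1, keepdims=True)
    sims = (pretrain_feats / norms) @ centroid          # cosine similarity per sample
    k = max(1, int(len(pretrain_feats) * keep_fraction))
    return np.argsort(-sims)[:k]                        # indices of the k most relevant

# Toy example: 100 random "pre-training" features, 10 "target" features.
rng = np.random.default_rng(0)
pretrain = rng.normal(size=(100, 16))
target = rng.normal(loc=1.0, size=(10, 16))
idx = filter_pretraining_set(pretrain, target, keep_fraction=0.25)
print(len(idx))  # 25 samples retained
```

In practice the features would come from a pretrained encoder rather than random draws, and the retained subset is what gets passed to the (possibly low-resolution) pre-training stage.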

