BatMan-CLR: Making Few-shots Meta-Learners Resilient Against Label Noise

09/12/2023
by   Jeroen M. Galjaard, et al.

The negative impact of label noise is well studied in classical supervised learning, yet it remains an open research question in meta-learning. Meta-learners aim to adapt to unseen learning tasks by learning a good initial model during meta-training and subsequently fine-tuning it on new tasks during meta-testing. In this paper, we present the first extensive analysis of the impact of varying levels of label noise on the performance of state-of-the-art meta-learners, specifically gradient-based N-way K-shot learners. We show that the accuracy of Reptile, iMAML, and foMAML drops by up to 42% on the Omniglot and CifarFS datasets when meta-training is affected by label noise. To strengthen resilience against label noise, we propose two sampling techniques, namely manifold (Man) and batch manifold (BatMan), which transform noisy supervised learners into semi-supervised ones and thereby increase the utility of noisy labels. We first construct manifold samples of N-way 2-contrastive-shot tasks through augmentation, learn the embedding via a contrastive loss in meta-training, and then perform classification through zeroing on the embedding in meta-testing. We show that our approach can effectively mitigate the impact of meta-training label noise even with 60% wrong labels, limiting the meta-testing accuracy drop to 2.5, 9.4, and 1.1 percentage points, respectively, with existing meta-learners across the Omniglot, CifarFS, and MiniImagenet datasets.
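The construction sketched in the abstract — pairing each (possibly noisily labeled) shot with an augmented view and training the embedding with a contrastive loss — can be illustrated with a minimal toy example. The encoder, augmentation, and NT-Xent-style loss below are hypothetical stand-ins chosen for brevity, not the paper's actual architecture or objective:

```python
# Toy sketch of an N-way 2-contrastive-shot episode: each of N shots is
# paired with an augmented view, and an embedding is trained so the two
# views attract each other (contrastive loss). All components here are
# illustrative assumptions, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def augment(x):
    """Hypothetical augmentation: small additive Gaussian noise."""
    return x + 0.01 * rng.standard_normal(x.shape)

def encode(x, W):
    """Toy linear encoder followed by L2 normalization."""
    z = x @ W
    return z / np.linalg.norm(z, axis=-1, keepdims=True)

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over N positive pairs (z1[i], z2[i])."""
    z = np.concatenate([z1, z2], axis=0)           # (2N, d)
    sim = z @ z.T / temperature                    # scaled cosine sims
    np.fill_diagonal(sim, -np.inf)                 # mask self-similarity
    n = len(z1)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

# N-way 2-contrastive-shot task: N=5 ways, 16-dim inputs, 8-dim embedding
N, d_in, d_emb = 5, 16, 8
shots = rng.standard_normal((N, d_in))             # one shot per way
W = rng.standard_normal((d_in, d_emb))             # untrained toy encoder
z1 = encode(shots, W)
z2 = encode(augment(shots), W)
loss = nt_xent_loss(z1, z2)
print(f"contrastive loss: {loss:.3f}")
```

Because the loss depends only on agreement between the two views of the same sample, not on the (possibly wrong) class label, the embedding remains useful even when a large fraction of labels are flipped — which is the intuition behind converting the noisy supervised episodes into semi-supervised ones.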


research · 04/11/2019
MxML: Mixture of Meta-Learners for Few-Shot Classification
A meta-model is trained on a distribution of similar tasks such that it ...

research · 10/08/2019
When Does Self-supervision Improve Few-shot Learning?
We present a technique to improve the generalization of deep representat...

research · 08/09/2021
The Role of Global Labels in Few-Shot Classification and How to Infer Them
Few-shot learning (FSL) is a central problem in meta-learning, where lea...

research · 08/26/2021
MCML: A Novel Memory-based Contrastive Meta-Learning Method for Few Shot Slot Tagging
Meta-learning is widely used for few-shot slot tagging in the task of fe...

research · 06/04/2022
Robust Meta-learning with Sampling Noise and Label Noise via Eigen-Reptile
Recent years have seen a surge of interest in meta-learning techniques f...

research · 01/30/2022
Similarity and Generalization: From Noise to Corruption
Contrastive learning aims to extract distinctive features from data by f...

research · 06/09/2021
Attentional meta-learners are polythetic classifiers
Polythetic classifications, based on shared patterns of features that ne...
