Similarity of Classification Tasks

by Cuong Nguyen, et al.

Recent advances in meta-learning have led to remarkable performance on several few-shot learning benchmarks. However, such success often ignores the similarity between training and testing tasks, resulting in a potentially biased evaluation. We therefore propose a generative approach based on a variant of Latent Dirichlet Allocation to analyse task similarity, in order to optimise and better understand the performance of meta-learning. We demonstrate that the proposed method provides an insightful evaluation of meta-learning algorithms on two few-shot classification benchmarks, and that the results match common intuition: the more similar the training and testing tasks, the higher the performance. Based on this similarity measure, we propose a task-selection strategy for meta-learning and show that it produces more accurate classification results than methods that select training tasks at random.
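The abstract's core idea can be illustrated with a minimal sketch: treat each task as a "document" over a discrete feature vocabulary, fit a topic model, and compare tasks through their inferred topic mixtures. This is only a toy illustration under stated assumptions, not the paper's actual model: the random count histograms, the number of topics, and the use of Jensen-Shannon distance as the similarity score are all illustrative choices, and scikit-learn's standard LDA stands in for the paper's LDA variant.

```python
# Illustrative sketch only: each few-shot task is summarised as a bag of
# discrete "visual words" (here, random counts standing in for clustered
# features). Similarity between two tasks is derived from the Jensen-Shannon
# distance between their inferred topic mixtures. All names and parameters
# are assumptions, not the paper's method.
import numpy as np
from scipy.spatial.distance import jensenshannon
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_tasks, vocab_size, n_topics = 20, 50, 5

# Stand-in data: one histogram over the feature vocabulary per task.
task_histograms = rng.integers(0, 10, size=(n_tasks, vocab_size))

lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
topic_mix = lda.fit_transform(task_histograms)  # each row sums to 1

def task_similarity(i: int, j: int) -> float:
    """Higher is more similar; JS distance is 0 for identical mixtures."""
    return 1.0 - jensenshannon(topic_mix[i], topic_mix[j])

# A task is maximally similar to itself; this score could then rank
# candidate training tasks instead of sampling them at random.
print(task_similarity(0, 0))
```

A similarity score of this kind could then drive the task-selection strategy mentioned above: rank candidate training tasks by similarity to the target tasks and prefer the closest ones over a random draw.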


Putting Theory to Work: From Learning Bounds to Meta-Learning Algorithms

Most of existing deep learning models rely on excessive amounts of label...

Task Weighting in Meta-learning with Trajectory Optimisation

Developing meta-learning algorithms that are un-biased toward a subset o...

The Devil is in the Details: On Models and Training Regimes for Few-Shot Intent Classification

Few-shot Intent Classification (FSIC) is one of the key challenges in mo...

Lessons from Chasing Few-Shot Learning Benchmarks: Rethinking the Evaluation of Meta-Learning Methods

In this work we introduce a simple baseline for meta-learning. Our uncon...

Adaptive Task Sampling for Meta-Learning

Meta-learning methods have been extensively studied and applied in compu...

Coarse-to-Fine Pseudo-Labeling Guided Meta-Learning for Few-Shot Classification

To endow neural networks with the potential to learn rapidly from a hand...

Learning to Initialize: Can Meta Learning Improve Cross-task Generalization in Prompt Tuning?

Prompt tuning (PT) which only tunes the embeddings of an additional sequ...