Task Attended Meta-Learning for Few-Shot Learning

by Aroof Aimen, et al.

Meta-learning (ML) has emerged as a promising direction for learning models under constrained resource settings such as few-shot learning. Popular ML approaches either learn a generalizable initial model or a generic parametric optimizer through episodic training. The former approaches leverage the knowledge from a batch of tasks to learn an optimal prior. In this work, we study the importance of a batch for ML. Specifically, we first incorporate a batch episodic training regimen to improve the learning of the generic parametric optimizer. We further hypothesize that the common assumption in batch episodic training, that each task in a batch contributes equally to learning an optimal meta-model, need not hold. We propose weighting the tasks in a batch according to their "importance" in improving the meta-model's learning. To this end, we introduce a training curriculum motivated by selective focus in humans, called task-attended meta-training, to weight the tasks in a batch. Task attention is a standalone module that can be integrated with any batch episodic training regimen. Comparisons of task-attended models with their non-task-attended counterparts on complex datasets such as miniImageNet and tieredImageNet validate its effectiveness.
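The core idea of weighting tasks in a batch can be sketched as follows. This is a minimal illustrative sketch, not the paper's learned task-attention module: here the weights come from a simple softmax over per-task query losses (an assumed proxy for task "importance"), whereas the paper learns the weights with a standalone attention module.

```python
import numpy as np

def task_attention_weights(task_losses, temperature=1.0):
    """Assumed proxy for task importance: a softmax over per-task
    query-set losses, so harder tasks receive more weight. The
    actual method learns these weights with an attention module."""
    z = np.asarray(task_losses, dtype=float) / temperature
    z = z - z.max()                 # numerical stability
    w = np.exp(z)
    return w / w.sum()

def weighted_meta_loss(task_losses, weights):
    """Weighted aggregation that replaces the uniform batch
    average used in standard batch episodic training."""
    return float(np.dot(weights, np.asarray(task_losses, dtype=float)))

# Query-set losses of a batch of 4 tasks (hypothetical values)
losses = [0.9, 0.4, 1.6, 0.7]
w = task_attention_weights(losses)
meta_loss = weighted_meta_loss(losses, w)
```

Under this loss-based proxy, the weighted meta-loss emphasizes the hardest task in the batch instead of averaging all tasks uniformly; the meta-model parameters would then be updated on `meta_loss` rather than on the plain mean.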




