NTKCPL: Active Learning on Top of Self-Supervised Model by Estimating True Coverage

by Ziting Wen et al.

High annotation costs for training machine learning classifiers have driven extensive research in active learning and self-supervised learning. Recent research has shown that, in the context of supervised learning, different active learning strategies need to be applied at different stages of training to ensure improved performance over the random baseline. We refer to the point at which the number of available annotations changes the suitable active learning strategy as the phase transition point. In this paper, we establish that when active learning is combined with self-supervised models to achieve improved performance, the phase transition point occurs earlier, making it challenging to determine which strategy should be used for previously unseen datasets. We argue that existing active learning algorithms are heavily affected by the phase transition because the empirical risk over the entire active learning pool that these algorithms estimate is inaccurate and depends on the number of labeled samples. To address this issue, we propose a novel active learning strategy, neural tangent kernel clustering-pseudo-labels (NTKCPL). It estimates the empirical risk from pseudo-labels and the model predictions obtained with an NTK approximation. We analyze the factors affecting this approximation error and design a pseudo-label clustering generation method that reduces it. We validate our method on five datasets, empirically demonstrating that it outperforms the baseline methods in most cases and remains valid over a wider range of training budgets.
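The abstract's core idea, estimating empirical risk over the unlabeled pool by comparing pseudo-labels against predictions from a kernel-regression approximation, can be sketched as follows. This is a minimal illustration, not the paper's implementation: an RBF kernel stands in for the actual neural tangent kernel of the trained network, and `estimated_empirical_risk`, its arguments, and the regularization term are illustrative choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Stand-in for the empirical NTK; the paper uses the neural
    # tangent kernel of the network (assumption for this sketch).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def ntk_predict(X_lab, Y_lab, X_pool, reg=1e-3):
    # Kernel-regression form of the NTK approximation:
    # f(x) = K(x, X_lab) (K(X_lab, X_lab) + reg*I)^{-1} Y_lab
    K_ll = rbf_kernel(X_lab, X_lab)
    K_pl = rbf_kernel(X_pool, X_lab)
    alpha = np.linalg.solve(K_ll + reg * np.eye(len(X_lab)), Y_lab)
    return K_pl @ alpha

def estimated_empirical_risk(X_lab, Y_lab, X_pool, pseudo_labels):
    # Disagreement between the approximated predictions and the
    # clustering-based pseudo-labels over the unlabeled pool serves
    # as a proxy for the empirical risk on that pool.
    preds = ntk_predict(X_lab, Y_lab, X_pool).argmax(axis=1)
    return float((preds != pseudo_labels).mean())
```

An active learning loop would then score candidate batches by how much labeling them reduces this estimated risk; the clustering step that produces `pseudo_labels` is what the paper designs to keep the approximation error small.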


Reducing Label Effort: Self-Supervised meets Active Learning

Active learning is a paradigm aimed at reducing the annotation effort by...

Deep Active Learning Using Barlow Twins

The generalisation performance of convolutional neural networks (CNNs) ...

Self-supervised Assisted Active Learning for Skin Lesion Segmentation

Label scarcity has been a long-standing issue for biomedical image segme...

On the Marginal Benefit of Active Learning: Does Self-Supervision Eat Its Cake?

Active learning is the set of techniques for intelligently labeling larg...

Coincidence, Categorization, and Consolidation: Learning to Recognize Sounds with Minimal Supervision

Humans do not acquire perceptual abilities in the way we train machines....

AstronomicAL: An interactive dashboard for visualisation, integration and classification of data using Active Learning

AstronomicAL is a human-in-the-loop interactive labelling and training d...
