Latent Class-Conditional Noise Model

by Jiangchao Yao, et al.

Learning with noisy labels has become imperative in the Big Data era, as it saves expensive human labor on accurate annotation. Previous noise-transition-based methods have achieved theoretically grounded performance under the Class-Conditional Noise model (CCN). However, these approaches build upon an ideal but impractical anchor set that is assumed available for pre-estimating the noise transition. Even though subsequent works adapt the estimation as a neural layer, the ill-posed stochastic learning of its parameters in back-propagation easily falls into undesired local minima. We solve this problem by introducing a Latent Class-Conditional Noise model (LCCN) that parameterizes the noise transition under a Bayesian framework. By projecting the noise transition into the Dirichlet space, the learning is constrained on a simplex characterized by the complete dataset, instead of some ad-hoc parametric space wrapped by the neural layer. We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels for training the classifier and modeling the noise. Our approach safeguards the stable update of the noise transition, avoiding the previous arbitrary tuning from a mini-batch of samples. We further generalize LCCN to counterparts compatible with open-set noisy labels, semi-supervised learning, and cross-model training. A range of experiments demonstrates the advantages of LCCN and its variants over current state-of-the-art methods.
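The core loop the abstract describes alternates two steps: (1) sample each latent true label from its posterior, combining the classifier's prediction p(z | x) with the noise transition p(y | z), and (2) re-estimate the Dirichlet-parameterized transition matrix from the sampled (true, noisy) label counts over the dataset. The sketch below illustrates that alternation with NumPy; it is a minimal, hypothetical illustration of the general scheme, not the authors' implementation, and all function names and the symmetric Dirichlet prior `alpha` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_true_labels(probs, noisy_labels, transition):
    """Gibbs-style draw of the latent true labels.

    probs:        (n, K) classifier softmax outputs, p(z | x)
    noisy_labels: (n,)   observed noisy labels y
    transition:   (K, K) row-stochastic matrix, transition[z, y] = p(y | z)
    """
    # Posterior over the latent true label: p(z | x, y) ∝ p(z | x) * p(y | z)
    post = probs * transition[:, noisy_labels].T   # (n, K)
    post /= post.sum(axis=1, keepdims=True)
    return np.array([rng.choice(post.shape[1], p=p) for p in post])

def update_transition(true_labels, noisy_labels, K, alpha=1.0):
    """Re-estimate the noise transition from sampled (z, y) counts,
    using the posterior mean under a symmetric Dirichlet(alpha) row prior."""
    counts = np.full((K, K), alpha)
    np.add.at(counts, (true_labels, noisy_labels), 1.0)  # accumulate co-occurrences
    return counts / counts.sum(axis=1, keepdims=True)    # normalize each row
```

Because the transition update pools counts across the whole (sampled) dataset rather than fitting free parameters per mini-batch, each row stays on the probability simplex and the estimate changes smoothly between iterations, which is the stability property the abstract emphasizes.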




