Noise-Robust Bidirectional Learning with Dynamic Sample Reweighting

09/03/2022
by Chen-Chen Zong, et al.

Deep neural networks trained with the standard cross-entropy loss are prone to memorizing noisy labels, which degrades their performance. Negative learning, which trains on complementary labels, is more robust to label noise but converges extremely slowly. In this paper, we first introduce a bidirectional learning scheme in which positive learning ensures fast convergence while negative learning robustly copes with label noise. We then propose a dynamic sample reweighting strategy that globally weakens the effect of noisily labeled samples by exploiting the strong ability of negative learning to separate clean from noisy samples in the predicted probability distribution. In addition, we incorporate self-distillation to further improve model performance. The code is available at <https://github.com/chenchenzong/BLDR>.
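The two directions of the scheme can be illustrated with a minimal sketch: positive learning is ordinary cross-entropy on the given label, while negative learning penalizes confidence on a complementary label (a class the sample is assumed *not* to belong to). The `alpha` mixing coefficient and the `sample_weight` hook for down-weighting likely-noisy samples are illustrative assumptions here, not the paper's exact formulation.

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def positive_learning_loss(probs, y):
    """Positive learning: standard cross-entropy, -log p_y,
    which pushes the probability of the given label up."""
    return -math.log(probs[y])

def negative_learning_loss(probs, y_comp):
    """Negative learning: complementary-label loss, -log(1 - p_ybar),
    which only pushes down the probability of a label the sample does
    NOT belong to, so an incorrect given label does less damage."""
    return -math.log(1.0 - probs[y_comp])

def bidirectional_loss(logits, y, y_comp, alpha=0.5, sample_weight=1.0):
    """Mix both directions, then scale by a per-sample weight so that
    samples judged likely noisy can be globally down-weighted.
    alpha and sample_weight are illustrative, not from the paper."""
    probs = softmax(logits)
    loss = alpha * positive_learning_loss(probs, y) \
        + (1.0 - alpha) * negative_learning_loss(probs, y_comp)
    return sample_weight * loss
```

In this sketch the complementary-label term stays small and well-behaved even when the given label `y` is wrong, which is the robustness property the abstract attributes to negative learning; the weight factor is where a dynamic reweighting strategy would plug in.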

Related research

12/08/2020 · Multi-Objective Interpolation Training for Robustness to Label Noise
Deep neural networks trained with standard cross-entropy loss memorize n...

03/13/2023 · Twin Contrastive Learning with Noisy Labels
Learning from noisy data is a challenging task that significantly degene...

05/02/2022 · SELC: Self-Ensemble Label Correction Improves Learning with Noisy Labels
Deep neural networks are prone to overfitting noisy labels, resulting in...

07/31/2021 · Learning with Noisy Labels via Sparse Regularization
Learning with noisy labels is an important and challenging task for trai...

11/21/2022 · Blind Knowledge Distillation for Robust Image Classification
Optimizing neural networks with noisy labels is a challenging task, espe...

06/30/2022 · ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State
To train robust deep neural networks (DNNs), we systematically study sev...

12/02/2022 · Avoiding spurious correlations via logit correction
Empirical studies suggest that machine learning models trained with empi...
