Robust Sampling in Deep Learning

06/04/2020
by Aurora Cobo Aguilera, et al.

Deep learning requires regularization mechanisms to reduce overfitting and improve generalization. We address this problem with a new regularization method based on distributionally robust optimization. The key idea is to modify each sample's contribution so as to tighten the empirical risk bound. During stochastic training, samples are selected according to their accuracy, so that the worst-performing samples contribute the most to the optimization. We study different scenarios and identify those in which the method accelerates convergence or improves accuracy.
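The abstract describes weighting samples by how badly the model currently performs on them. Below is a minimal sketch of one way such a scheme can be implemented in a PyTorch training loop: per-sample losses are reweighted by a softmax over the losses themselves, so higher-loss samples dominate the gradient. The softmax surrogate, the temperature `tau`, the helper `robust_batch_loss`, and the toy model and data are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch (not the paper's exact method): reweight each
# sample's loss within a mini-batch so that worst-performing samples
# contribute the most to the update.
import torch
import torch.nn as nn

def robust_batch_loss(per_sample_losses: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Weight each sample's loss by a softmax over the losses themselves,
    so higher-loss (worse-performing) samples receive larger weights.
    The weights are detached so they act as fixed importance weights."""
    weights = torch.softmax(per_sample_losses.detach() / tau, dim=0)
    return (weights * per_sample_losses).sum()

# Toy usage: a small classifier trained with the reweighted loss.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss(reduction="none")  # keep per-sample losses

x = torch.randn(64, 10)          # dummy inputs
y = torch.randint(0, 2, (64,))   # dummy labels

for step in range(100):
    optimizer.zero_grad()
    losses = criterion(model(x), y)        # shape: (batch,)
    loss = robust_batch_loss(losses, tau=0.5)
    loss.backward()
    optimizer.step()
```

As `tau` shrinks, the weighting concentrates on the single worst sample (approaching a worst-case objective); as it grows, the loss approaches the usual uniform batch average.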


Related research

06/13/2020
Risk Variance Penalization: From Distributional Robustness to Causality
Learning under multi-environments often requires the ability of out-of-d...

03/02/2019
Time-Delay Momentum: A Regularization Perspective on the Convergence and Generalization of Stochastic Momentum for Deep Learning
In this paper we study the problem of convergence and generalization err...

07/22/2018
PaloBoost: An Overfitting-robust TreeBoost with Out-of-Bag Sample Regularization Techniques
Stochastic Gradient TreeBoost is often found in many winning solutions i...

12/20/2022
Distributional Robustness Bounds Generalization Errors
Bayesian methods, distributionally robust optimization methods, and regu...

10/14/2017
Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization
Overfitting is one of the most critical challenges in deep neural networ...

01/21/2020
batchboost: regularization for stabilizing training with resistance to underfitting & overfitting
Overfitting & underfitting and stable training are an important challe...

04/16/2019
GradMask: Reduce Overfitting by Regularizing Saliency
With too few samples or too many model parameters, overfitting can inhib...
