MixUp as Locally Linear Out-Of-Manifold Regularization

09/07/2018
by   Hongyu Guo, et al.

MixUp, a data augmentation approach that mixes random pairs of samples, has been shown to significantly improve the predictive accuracy of state-of-the-art deep neural networks. The power of MixUp, however, is mostly established empirically, and its workings and effectiveness have not been explained in any depth. In this paper, we develop a theoretical understanding of MixUp as a form of out-of-manifold regularization, which constrains the model on the input space beyond the data manifold. This analytical study also enables us to identify a limitation of MixUp caused by manifold intrusion, where synthetic samples collide with real examples on the manifold. Such intrusion gives rise to over-regularization and thereby under-fitting. To address this issue, we further propose a novel regularizer in which mixing policies are adaptively learned from the data and a manifold intrusion loss is adopted to avoid collision with the data manifold. We empirically show, on several benchmark datasets, that our regularizer avoids over-regularization and improves accuracy over state-of-the-art deep classification models and MixUp.
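For reference, standard MixUp builds each synthetic training pair by linearly interpolating two random examples and their labels, with the mixing coefficient drawn from a Beta distribution. A minimal sketch (function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=None):
    """Mix two (input, one-hot label) pairs by linear interpolation.

    lambda ~ Beta(alpha, alpha), as in the standard MixUp formulation;
    alpha=1.0 makes the mixing coefficient uniform on [0, 1].
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    # Both input and label are mixed with the same coefficient,
    # which is what imposes the locally linear constraint.
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y
```

The paper's concern is that such mixed points can land back on the data manifold near a real example of a different class (manifold intrusion); the proposed regularizer learns the mixing policy instead of fixing it.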

