Relative Deviation Margin Bounds

06/26/2020
by Corinna Cortes, et al.

We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds: data-dependent bounds and bounds valid for general families, expressed in terms of the Rademacher complexity or the empirical ℓ_∞ covering number of the hypothesis set used. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
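For context, a minimal sketch of the classical (non-relative) margin bound that guarantees of this type refine; the notation below (sample S of size m, margin parameter ρ > 0, hypothesis set H with Rademacher complexity ℜ_m(H), confidence 1 − δ) is the standard one from the margin-bound literature and is not taken from this paper. With probability at least 1 − δ, for all h ∈ H,

\hat{R}_{S,\rho}(h) \;=\; \frac{1}{m}\sum_{i=1}^{m} \mathbf{1}_{\{y_i h(x_i) \le \rho\}},
\qquad
R(h) \;\le\; \hat{R}_{S,\rho}(h) \;+\; \frac{2}{\rho}\,\mathfrak{R}_m(H) \;+\; \sqrt{\frac{\log(1/\delta)}{2m}}.

Roughly speaking, relative deviation bounds make the deviation term scale with the empirical margin loss \hat{R}_{S,\rho}(h) itself, which yields more favorable guarantees when that loss is small.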


Related research

05/16/2022 · ℋ-Consistency Estimation Error of Surrogate Loss Minimizers
We present a detailed study of estimation errors in terms of surrogate l...

05/20/2016 · Structured Prediction Theory Based on Factor Graph Complexity
We present a general theoretical analysis of structured prediction with ...

01/19/2007 · Algorithmic Complexity Bounds on Future Prediction Errors
We bound the future loss when predicting any (computably) stochastic seq...

10/11/2018 · Classification using margin pursuit
In this work, we study a new approach to optimizing the margin distribut...

07/21/2020 · On the Rademacher Complexity of Linear Hypothesis Sets
Linear predictors form a rich class of hypotheses used in a variety of l...

04/21/2022 · Differentially Private Learning with Margin Guarantees
We present a series of new differentially private (DP) algorithms with d...

12/04/2020 · Margin of Victory in Tournaments: Structural and Experimental Results
Tournament solutions are standard tools for identifying winners based on...
