Relative Deviation Margin Bounds

06/26/2020
by Corinna Cortes, et al.

We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds: data-dependent bounds and bounds valid for general families, expressed in terms of the Rademacher complexity or the empirical ℓ_∞ covering number of the hypothesis set used. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
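For context, the classical (non-relative) margin bound that results of this kind refine can be sketched as follows. Assuming a hypothesis set H, a margin parameter ρ > 0, a sample of size m, and writing R(h) for the generalization error, R̂_ρ(h) for the empirical ρ-margin loss, and ℜ_m(H) for the Rademacher complexity, the standard guarantee states that with probability at least 1 − δ, for all h ∈ H:

```latex
% Standard margin bound via Rademacher complexity
% (Koltchinskii & Panchenko-style guarantee; symbols as defined above)
R(h) \;\le\; \widehat{R}_{\rho}(h)
  \;+\; \frac{2}{\rho}\,\mathfrak{R}_m(H)
  \;+\; \sqrt{\frac{\log(1/\delta)}{2m}}
```

Relative deviation bounds sharpen the additive √(1/m) term by scaling the deviation with the (empirical) margin loss itself, yielding more favorable guarantees when that loss is small.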
