Relative Deviation Margin Bounds
We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds: data-dependent bounds and bounds valid for general families, expressed in terms of the Rademacher complexity or the empirical ℓ_∞ covering number of the hypothesis set used. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
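For context, a minimal LaTeX sketch of the classical additive margin bound (Koltchinskii–Panchenko style) that relative deviation bounds refine; the notation R(h), \widehat{R}_\rho(h), and \mathfrak{R}_m(H) is standard but illustrative, and the relative-deviation quantity shown in the comments is one common form, not necessarily the exact statement in this paper.

```latex
% Classical additive margin bound, shown for context: with probability at
% least 1 - \delta over an i.i.d. sample of size m, for all h in H,
\[
  R(h) \;\le\; \widehat{R}_\rho(h)
    \;+\; \frac{2}{\rho}\,\mathfrak{R}_m(H)
    \;+\; \sqrt{\frac{\log(1/\delta)}{2m}},
\]
% where R(h) is the generalization error, \widehat{R}_\rho(h) the empirical
% margin loss at margin \rho > 0, and \mathfrak{R}_m(H) the Rademacher
% complexity of H. A relative deviation bound instead controls a normalized
% quantity such as
% (R(h) - \widehat{R}_\rho(h)) / \sqrt{R(h)},
% which yields sharper guarantees than the additive form when the empirical
% margin loss is small.
```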