`local' vs. `global' parameters -- breaking the gaussian complexity barrier

04/09/2015
by Shahar Mendelson, et al.

We show that if F is a convex class of functions that is L-subgaussian, the error rate of learning problems generated by independent noise is equivalent to a fixed point determined by `local' covering estimates of the class, rather than by the gaussian averages. To that end, we establish new sharp upper and lower estimates on the error rate for such problems.
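For orientation, the sketch below records the standard meanings of the objects named in the abstract: the (global) gaussian average of the class, the (local) covering numbers of the class intersected with a small ball, and a fixed point defined from such local estimates. These are the usual textbook definitions, not the paper's exact statements; in particular the functional \phi_n, the constant c, and the noise level \sigma in the last display are illustrative placeholders.

% Global complexity: the gaussian average of F over a sample X_1,...,X_n,
% where g_1,...,g_n are independent standard gaussian random variables.
\[
  \ell_*(F) \;=\; \mathbb{E}\,\sup_{f \in F}
    \Bigl|\,\frac{1}{\sqrt{n}} \sum_{i=1}^{n} g_i\, f(X_i)\Bigr| .
\]

% Local complexity: covering numbers of the localized class F \cap rB, where
% N(H, \varepsilon, d) is the minimal number of d-balls of radius \varepsilon
% needed to cover H, and B is the unit ball of the metric d.
\[
  \log N\bigl(F \cap rB,\ \varepsilon,\ d\bigr), \qquad 0 < \varepsilon \le r .
\]

% A fixed point determined by such local estimates: the smallest radius at
% which a local complexity functional \phi_n(r), built from the covering
% numbers above, is dominated by the noise level \sigma (placeholders only).
\[
  r^{*} \;=\; \inf\Bigl\{\, r > 0 \;:\; \phi_n(r) \;\le\; c\,\frac{\sigma\, r}{\sqrt{n}} \,\Bigr\} .
\]

The contrast drawn in the abstract is between rates governed by the global quantity \ell_*(F) and rates governed by a fixed point such as r^{*}, which depends only on the structure of F near the scale r.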
