A vector-contraction inequality for Rademacher complexities

05/01/2016
by   Andreas Maurer, et al.

The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables. Example applications are given for multi-category learning, K-means clustering and learning-to-learn.
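For orientation, the paper's main contraction result has roughly the following form (stated informally here; the precise hypotheses, constants, and the extension from Rademacher to general i.i.d. symmetric sub-gaussian variables are given in the paper). If F is a class of functions f taking values in a (possibly infinite-dimensional) Hilbert space, f_k(x_i) denotes the k-th component of f(x_i), the h_i are L-Lipschitz real-valued functions, and the epsilon_i, epsilon_{ik} are independent Rademacher variables, then

\[
\mathbb{E}\,\sup_{f\in\mathcal{F}} \sum_{i=1}^{n} \epsilon_i\, h_i\bigl(f(x_i)\bigr)
\;\le\; \sqrt{2}\,L\;\, \mathbb{E}\,\sup_{f\in\mathcal{F}} \sum_{i=1}^{n}\sum_{k} \epsilon_{ik}\, f_k(x_i).
\]

The right-hand side is a doubly indexed Rademacher average over the components of f, which is what makes the bound usable for vector-valued hypothesis classes such as those arising in multi-category learning and K-means clustering.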

