SVM Learning Rates for Data with Low Intrinsic Dimension

03/13/2020
by Thomas Hamm, et al.

We derive improved regression and classification rates for support vector machines using Gaussian kernels under the assumption that the data has some low-dimensional intrinsic structure. Our notion of intrinsic dimension is defined by the box-counting dimension of the support of the data-generating distribution. Under some standard regularity assumptions for regression and classification, we prove learning rates in which the dimension of the ambient space in several well-established rates is essentially replaced by the box-counting dimension of the support of the data-generating distribution. Furthermore, we show that a training-validation approach for choosing the hyperparameters of an SVM in a data-dependent way achieves the same rates adaptively, i.e. without knowledge of the data-generating distribution, and in particular without knowing the box-counting dimension of its support.
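To illustrate the training-validation idea in the abstract, here is a minimal sketch, not the authors' method: data lying on a 1-dimensional manifold (a circle) embedded in R^3, fitted with a Gaussian-kernel least-squares SVM (equivalently, kernel ridge regression as a simplified stand-in for the hinge-loss SVM), where the kernel width and regularization parameter are chosen purely by validation error over a finite grid. The manifold, grid values, and sample sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data on a 1-dimensional manifold (a circle) embedded in R^3:
# the intrinsic (box-counting) dimension is 1, the ambient dimension 3.
t = rng.uniform(0, 2 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t), np.zeros_like(t)])
y = np.sin(3 * t) + 0.1 * rng.standard_normal(t.shape)

# Train / validation split (sizes chosen arbitrarily for the sketch).
X_tr, y_tr = X[:120], y[:120]
X_val, y_val = X[120:], y[120:]

def gaussian_kernel(A, B, gamma):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_predict(gamma, lam):
    """Least-squares SVM / kernel ridge: solve (K + lam*I) alpha = y."""
    K = gaussian_kernel(X_tr, X_tr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
    return gaussian_kernel(X_val, X_tr, gamma) @ alpha

# Data-dependent hyperparameter choice: pick (gamma, lambda) minimizing
# validation error over a finite grid -- no knowledge of the intrinsic
# dimension of the data is used anywhere.
grid = [(g, l) for g in (0.1, 1.0, 10.0) for l in (1e-4, 1e-2, 1.0)]
best = min(grid, key=lambda p: np.mean((fit_predict(*p) - y_val) ** 2))
print("selected (gamma, lambda):", best)
```

The point of the adaptivity result is exactly this: the grid search sees only the validation error, yet (in the paper's setting, with suitably growing grids) the selected SVM attains rates governed by the intrinsic dimension 1 rather than the ambient dimension 3.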
