Nyström Subspace Learning for Large-scale SVMs

02/20/2020
by Weida Li, et al.

As an implementation of the Nyström method, Nyström computational regularization (NCR), applied to kernel classification and kernel ridge regression, has been shown to achieve optimal bounds in the large-scale statistical learning setting while enjoying much better time complexity. In this study, we propose a Nyström subspace learning (NSL) framework to show that all that is needed to apply the Nyström method, including NCR, to any kernel SVM is to use efficient off-the-shelf linear SVM solvers as a black box. Based on our analysis, the bounds developed for the Nyström method are linked to NSL, and the analytical difference between two distinct implementations of the Nyström method is made explicit. Moreover, NSL yields sharper theoretical results for the clustered Nyström method. Finally, both regression and classification tasks are performed to compare the two implementations of the Nyström method.
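The core idea, reducing a kernel SVM to a linear SVM over Nyström features, can be sketched with off-the-shelf tools. This is an illustrative approximation using scikit-learn's `Nystroem` feature map and `LinearSVC`, not the paper's exact NSL algorithm; the dataset, kernel, and all hyperparameters below are arbitrary choices for demonstration.

```python
# Sketch: kernel SVM approximated by a Nystroem feature map followed by a
# linear SVM solver used as a black box (illustrative, not the paper's NSL).
from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Synthetic binary classification data (stand-in for a large-scale task).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Nystroem projects the data onto the subspace spanned by m landmark points,
# so the downstream problem becomes an ordinary linear SVM.
model = make_pipeline(
    Nystroem(kernel="rbf", gamma=0.1, n_components=100, random_state=0),
    LinearSVC(C=1.0),
)
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```

The appeal is that the kernel only enters through the feature map; any efficient linear SVM solver can be swapped in without modification.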
