Analysis of regularized Nyström subsampling for regression functions of low smoothness
This paper studies a Nyström-type subsampling approach to large-scale kernel learning methods in the misspecified case, where the target function is not assumed to belong to the reproducing kernel Hilbert space generated by the underlying kernel. Despite its practical importance, this case is less well understood. To model it, the smoothness of the target functions is described in terms of general source conditions. Surprisingly, for almost the whole range of source conditions describing the misspecified case, the corresponding learning-rate bounds can be achieved with a single value of the regularization parameter. This observation makes it possible to formulate mild conditions under which plain Nyström subsampling can be realized at subquadratic cost while maintaining the guaranteed learning rates.
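To make the computational point concrete, the following is a minimal sketch of plain Nyström subsampling combined with Tikhonov (ridge) regularization, not the estimator analyzed in the paper: the Gaussian kernel, uniform landmark sampling, and the names `nystroem_krr`, `m`, `lam`, and `gamma` are illustrative assumptions rather than the authors' notation.

```python
import numpy as np

def nystroem_krr(X, y, m, lam, gamma, seed=None):
    """Plain Nyström-subsampled kernel ridge regression (sketch).

    Uniformly samples m landmark points, builds the reduced kernel
    matrices, and solves a regularized m x m system instead of the
    full n x n one.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # uniform subsampling
    Xm = X[idx]

    def kernel(A, B):
        # Gaussian (RBF) kernel; gamma is an assumed bandwidth parameter
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    K_nm = kernel(X, Xm)   # n x m cross-kernel matrix
    K_mm = kernel(Xm, Xm)  # m x m landmark kernel matrix

    # Tikhonov-regularized normal equations in the landmark subspace:
    # (K_nm^T K_nm + n * lam * K_mm) alpha = K_nm^T y
    A = K_nm.T @ K_nm + n * lam * K_mm
    b = K_nm.T @ y
    alpha = np.linalg.solve(A + 1e-12 * np.eye(m), b)  # small jitter for stability

    def predict(X_new):
        return kernel(X_new, Xm) @ alpha

    return predict
```

The relevant cost is dominated by forming K_nm^T K_nm, which is O(n m^2); choosing the number of landmarks m to grow slower than n therefore yields subquadratic cost in the sample size, which is the regime the abstract refers to. How small m can be taken, and how the single regularization parameter lam should be set so that the learning-rate guarantees survive the subsampling, is the subject of the paper's analysis.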