A maximum principle argument for the uniform convergence of graph Laplacian regressors
We study asymptotic consistency guarantees for a non-parametric regression problem with Laplacian regularization. In particular, we consider samples (x_1, y_1), ..., (x_n, y_n) from some distribution on the product M×R, where M is an m-dimensional manifold embedded in R^d. A geometric graph on the point cloud {x_1, ..., x_n} is constructed by connecting points that are within some specified distance ε_n. A suitable semi-linear equation involving the resulting graph Laplacian is used to obtain a regressor for the observed values of y. We establish probabilistic error rates for the uniform difference between the regressor constructed from the observed data and the Bayes regressor (or trend) associated with the ground-truth distribution. We give the explicit dependence of the rates on the connectivity parameter ε_n, the strength of regularization β_n, and the number of data points n. Our argument relies on a simple, yet powerful, maximum principle for the graph Laplacian. We also address a simple extension of the framework to a semi-supervised setting.
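The construction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's method: it builds the ε-neighborhood graph and its unnormalized Laplacian, then solves the simplest linear variant of a Laplacian-regularized regression, (I + β L) u = y, in place of the paper's (unspecified) semi-linear equation. All function names and parameter choices here are hypothetical.

```python
import numpy as np

def epsilon_graph_laplacian(X, eps):
    """Unnormalized Laplacian L = D - W of the eps-neighborhood graph on X.

    X: (n, d) array of points; two points are connected iff their
    Euclidean distance is at most eps (no self-loops).
    """
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = ((dists <= eps) & (dists > 0)).astype(float)  # adjacency matrix
    return np.diag(W.sum(axis=1)) - W

def laplacian_regressor(X, y, eps, beta):
    """Linear Laplacian-regularized regressor: solve (I + beta * L) u = y.

    This is the minimizer of ||u - y||^2 + beta * u^T L u, a stand-in
    for the semi-linear equation used in the paper.
    """
    L = epsilon_graph_laplacian(X, eps)
    n = len(y)
    return np.linalg.solve(np.eye(n) + beta * L, y)
```

For example, on points sampled from a curve, `laplacian_regressor(X, y, eps=0.3, beta=0.5)` returns a smoothed version of the noisy labels y; with β = 0 the regressor reduces to interpolation of the data, and larger β enforces more smoothness along the graph.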