Gaussian Process Regression in the Flat Limit

01/04/2022
by Simon Barthelmé, et al.

Gaussian process (GP) regression is a fundamental tool in Bayesian statistics. It is also known as kriging and is the Bayesian counterpart to frequentist kernel ridge regression. Most of the theoretical work on GP regression has focused on large-n asymptotics, characterising the behaviour of GP regression as the amount of data increases. Fixed-sample analysis is much more difficult outside of simple cases, such as locations on a regular grid. In this work we perform a fixed-sample analysis in a regime first studied in approximation theory by Driscoll & Fornberg (2002), called the "flat limit". In flat-limit asymptotics, the goal is to characterise kernel methods as the length-scale of the kernel function tends to infinity, so that kernels appear flat over the range of the data. Surprisingly, this limit is well-defined and displays interesting behaviour: Driscoll & Fornberg showed that radial basis function interpolation converges in the flat limit to polynomial interpolation if the kernel is Gaussian. Leveraging recent results on the spectral behaviour of kernel matrices in the flat limit, we study the flat limit of Gaussian process regression. Our results show that GP regression tends in the flat limit to (multivariate) polynomial regression or (polyharmonic) spline regression, depending on the kernel. Importantly, this holds for both the predictive mean and the predictive variance, so that the posterior predictive distributions become equivalent. Our results have practical consequences: for instance, they show that optimal GP predictions, in the sense of leave-one-out loss, may occur at very large length-scales, which would be invisible to current implementations because of numerical difficulties.
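As a back-of-the-envelope illustration (ours, not the authors' code), the NumPy sketch below compares a noise-free GP interpolant with a Gaussian kernel against the degree-(n-1) polynomial interpolant as the length-scale ell grows. The helper name gp_mean, the toy data, and the jitter value are our own choices; the point is that the condition number of the kernel matrix explodes with ell, which is exactly the numerical difficulty mentioned above.

```python
import numpy as np

# Toy 1-D data: six points with a smooth trend plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 6)
y = 0.5 * x + 0.2 * x**2 + 0.05 * rng.standard_normal(x.size)
x_test = np.array([-0.9, -0.3, 0.3, 0.9])

def gp_mean(x_train, y_train, x_new, ell, jitter=1e-12):
    """Posterior mean of a noise-free GP with Gaussian kernel
    k(a, b) = exp(-(a - b)^2 / (2 ell^2)); jitter is only for stability."""
    K = np.exp(-(x_train[:, None] - x_train[None, :]) ** 2 / (2 * ell**2))
    k_star = np.exp(-(x_new[:, None] - x_train[None, :]) ** 2 / (2 * ell**2))
    alpha = np.linalg.solve(K + jitter * np.eye(x_train.size), y_train)
    return k_star @ alpha, np.linalg.cond(K)

# Flat-limit reference: with n = 6 noise-free observations, the Gaussian-kernel
# GP interpolant converges to the degree-5 polynomial interpolant.
coeffs = np.polyfit(x, y, deg=x.size - 1)
print("poly interp :", np.round(np.polyval(coeffs, x_test), 4))

for ell in [0.5, 2.0, 5.0, 50.0]:
    mean, cond = gp_mean(x, y, x_test, ell)
    print(f"ell = {ell:5.1f}  cond(K) = {cond:8.1e}  GP mean:", np.round(mean, 4))
```

At moderate length-scales the GP mean already agrees with the polynomial interpolant to several digits, while at ell = 50 the linear solve is dominated by round-off, so a naive implementation cannot reach, or even detect, the flat-limit regime.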
