Fast increased fidelity approximate Gibbs samplers for Bayesian Gaussian process regression

06/11/2020
by   Kelly R. Moran, et al.

The use of Gaussian processes (GPs) is supported by efficient sampling algorithms, a rich methodological literature, and strong theoretical grounding. However, due to their prohibitive computation and storage demands, the use of exact GPs in Bayesian models is limited to problems containing at most several thousand observations. Sampling requires matrix operations that scale as 𝒪(n^3), where n is the number of unique inputs, and storage of individual matrices scales as 𝒪(n^2), which can quickly overwhelm the resources of most modern computers. To overcome these bottlenecks, we develop a sampling algorithm using ℋ-matrix approximation of the matrices comprising the GP posterior covariance. These matrices can approximate the true conditional covariance matrix within machine precision and allow for sampling algorithms that scale as 𝒪(n log^2 n) in time, with storage demands scaling as 𝒪(n log n). We also describe how these algorithms can be used as building blocks to model higher-dimensional surfaces at 𝒪(d n log^2 n), where d is the dimension of the surface under consideration, using tensor products of one-dimensional GPs. Though various scalable approximations have been proposed for Bayesian GP inference when n is large, to our knowledge none of these methods show that the approximation's Kullback-Leibler divergence to the true posterior can be made arbitrarily small, potentially no worse than the error introduced by finite computer arithmetic. We describe ℋ-matrices, give an efficient Gibbs sampler using these matrices for one-dimensional GPs, offer a proposed extension to higher-dimensional surfaces, and investigate the performance of this fast increased fidelity approximate GP, FIFA-GP, using both simulated and real data sets.
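To make the bottleneck concrete, the sketch below draws one sample from an exact GP posterior using a dense Cholesky factorization. This is not the paper's ℋ-matrix method; it is a minimal baseline whose Cholesky step costs 𝒪(n^3) time and whose covariance storage costs 𝒪(n^2), the scaling that FIFA-GP is designed to avoid. The squared-exponential kernel and all parameter values are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): exact GP posterior sampling via a
# dense Cholesky factorization, showing the O(n^3) / O(n^2) bottleneck.
import numpy as np

def sq_exp_kernel(x, y, lengthscale=0.5, variance=1.0):
    """Squared-exponential covariance between 1-D input vectors x and y."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp_posterior(x_train, y_train, x_test, noise_var=0.1, rng=None):
    """Draw one sample from the GP posterior at x_test.

    Factoring the n x n training covariance is the O(n^3) step; storing
    K is the O(n^2) bottleneck that limits exact GPs to a few thousand
    observations.
    """
    rng = np.random.default_rng(rng)
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    L = np.linalg.cholesky(K)                               # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_star = sq_exp_kernel(x_test, x_train)
    mean = K_star @ alpha                                   # posterior mean
    v = np.linalg.solve(L, K_star.T)
    cov = sq_exp_kernel(x_test, x_test) - v.T @ v           # posterior cov
    # Jitter keeps the posterior covariance numerically positive definite.
    L_post = np.linalg.cholesky(cov + 1e-9 * np.eye(len(x_test)))
    return mean + L_post @ rng.standard_normal(len(x_test))
```

The paper's contribution can be read against this baseline: ℋ-matrix compression replaces the dense factorizations above with hierarchical low-rank factors, cutting the cost to 𝒪(n log^2 n) time and 𝒪(n log n) storage while keeping the approximation near machine precision.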
