Subspace-Induced Gaussian Processes

02/21/2018
by Zilong Tan, et al.

We present a new Gaussian process (GP) regression model in which the covariance kernel is indexed, or parameterized, by a sufficient dimension reduction subspace of a reproducing kernel Hilbert space. The covariance kernel is low-rank while still capturing the statistical dependence of the response on the covariates; this affords a significant improvement in computational efficiency as well as a potential reduction in the variance of predictions. We develop a fast Expectation-Maximization algorithm for estimating the parameters of the subspace-induced Gaussian process (SIGP). Extensive results on real data show that SIGP can outperform the standard full GP even with a low-rank inducing subspace of rank m ≤ 3.
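To make the low-rank idea concrete, here is a minimal sketch of GP posterior-mean prediction with a rank-m covariance K = Phi Phi^T, where the features Phi come from projecting a base kernel onto an m-dimensional subspace. This is an illustration of the generic low-rank mechanism only, not the paper's method: the basis B is a fixed random placeholder rather than the SDR subspace the paper estimates with its EM algorithm, the RBF base kernel is an assumption, and names such as low_rank_gp_predict, gamma, and noise_var are illustrative.

import numpy as np

def low_rank_gp_predict(X, y, X_star, B, noise_var=1e-2, gamma=1.0):
    """Posterior mean of a GP with rank-m covariance K = Phi Phi^T,
    where Phi = K_rbf(X, X) @ B and B (n x m) spans the inducing
    subspace. The Woodbury identity reduces the solve from O(n^3)
    to O(n m^2) once the features are formed."""
    def rbf(A, C):
        sq = ((A[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    Phi = rbf(X, X) @ B                  # n x m low-rank features
    Phi_star = rbf(X_star, X) @ B        # n* x m features at test inputs
    # Woodbury: (Phi Phi^T + s2 I_n)^{-1} y
    #   = (y - Phi (s2 I_m + Phi^T Phi)^{-1} Phi^T y) / s2
    A = noise_var * np.eye(B.shape[1]) + Phi.T @ Phi
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise_var
    return Phi_star @ (Phi.T @ alpha)

# Toy usage: a random rank-3 basis stands in for the estimated subspace.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
B = rng.normal(size=(100, 3))            # placeholder basis, m = 3
print(low_rank_gp_predict(X, y, X[:5], B))

Because only an m x m system is solved, the per-prediction cost scales with the subspace rank rather than the full training set size, which is the computational advantage the abstract refers to.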
