Input Dependent Sparse Gaussian Processes

by Bahram Jafrasteh, et al.

Gaussian Processes (GPs) are Bayesian models that provide uncertainty estimates associated with their predictions. They are also very flexible due to their non-parametric nature. Nevertheless, GPs suffer from poor scalability as the number of training instances N increases. More precisely, they have a cubic cost with respect to N. To overcome this problem, sparse GP approximations are often used, in which a set of M ≪ N inducing points is introduced during training. The locations of the inducing points are learned by treating them as parameters of an approximate posterior distribution q. Sparse GPs, combined with variational inference for inferring q, reduce the training cost of GPs to 𝒪(M^3). Critically, the inducing points determine the flexibility of the model, and they are often located in regions of the input space where the latent function changes. A limitation, however, is that for some learning tasks a large number of inducing points may be required to obtain good prediction performance. To address this limitation, we propose here to amortize the computation of the inducing-point locations, as well as the parameters of the variational posterior approximation q. For this, we use a neural network that receives the observed data as input and outputs the inducing-point locations and the parameters of q. We evaluate our method in several experiments, showing that it performs similarly to or better than other state-of-the-art sparse variational GP approaches. Moreover, with our method the number of inducing points is reduced drastically due to their dependency on the input data. This makes our method scale to larger datasets and gives it faster training and prediction times.
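The amortization idea described above can be illustrated with a minimal sketch in PyTorch. This is not the authors' implementation: the module name, layer sizes, and the simplifying choice of a diagonal covariance for q are all assumptions made for illustration. The key property it shows is that the inducing-point locations and the parameters of q are produced by a network conditioned on the observed data, rather than being free parameters, and that a mean-pooled encoding makes the output invariant to the ordering of the training instances.

```python
import torch
import torch.nn as nn

class AmortizedInducingNet(nn.Module):
    """Hypothetical sketch of an amortization network: it maps a batch of
    observed inputs to M inducing-point locations and the parameters of
    the variational posterior q (here simplified to a mean and a diagonal
    log-variance per inducing point)."""

    def __init__(self, input_dim: int, num_inducing: int, hidden: int = 64):
        super().__init__()
        self.num_inducing = num_inducing
        self.input_dim = input_dim
        # Encode each data point, then mean-pool: the resulting summary is
        # permutation-invariant in the training instances.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
        )
        # Output: M*D inducing locations + M means of q + M log-variances of q.
        self.head = nn.Linear(hidden, num_inducing * (input_dim + 2))

    def forward(self, X: torch.Tensor):
        # X: (N, D) batch of observed inputs
        summary = self.encoder(X).mean(dim=0)      # (hidden,) set summary
        out = self.head(summary)
        M, D = self.num_inducing, self.input_dim
        Z = out[: M * D].reshape(M, D)             # inducing-point locations
        q_mean = out[M * D : M * D + M]            # variational mean of q
        q_logvar = out[M * D + M :]                # diagonal log-variance of q
        return Z, q_mean, q_logvar

# Usage: the same network produces input-dependent inducing points for any batch.
net = AmortizedInducingNet(input_dim=3, num_inducing=5)
X = torch.randn(100, 3)
Z, q_mean, q_logvar = net(X)
print(Z.shape, q_mean.shape, q_logvar.shape)
```

In a full sparse variational GP, Z, q_mean, and the covariance of q would then enter the usual variational lower bound; because they are computed from the data rather than optimized per inducing point, M can be kept small, which is the source of the speed-ups claimed in the abstract.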

Related research:

Global inducing point variational posteriors for Bayesian neural networks and deep Gaussian processes (05/17/2020)
Variational inference is a popular approach to reason about uncertainty ...

Adaptive Inducing Points Selection For Gaussian Processes (07/21/2021)
Gaussian Processes (GPs) are flexible non-parametric models with strong ...

Sparse within Sparse Gaussian Processes using Neighbor Information (11/10/2020)
Approximations to Gaussian processes based on inducing variables, combin...

Transforming Gaussian Processes With Normalizing Flows (11/03/2020)
Gaussian Processes (GPs) can be used as flexible, non-parametric functio...

Non-Parametric Variational Inference with Graph Convolutional Networks for Gaussian Processes (09/08/2018)
Inference for GP models with non-Gaussian noises is computationally expe...

Gaussian Processes for Missing Value Imputation (04/10/2022)
Missing values are common in many real-life datasets. However, most of t...

Posterior and Computational Uncertainty in Gaussian Processes (05/30/2022)
Gaussian processes scale prohibitively with the size of the dataset. In ...
