Mixtures of Gaussian Processes for regression under multiple prior distributions

04/19/2021
by Sarem Seitz, et al.

When constructing a Bayesian Machine Learning model, we might be faced with multiple different prior distributions and therefore need to incorporate them into our model in a sensible manner. While this situation is reasonably well explored for classical Bayesian Statistics, it appears useful to develop a corresponding method for complex Machine Learning problems. Given their underlying Bayesian framework and their widespread popularity, Gaussian Processes are a good candidate to tackle this task. We therefore extend the idea of Mixture models for Gaussian Process regression in order to work with multiple prior beliefs at once - both an analytical regression formula and a Sparse Variational approach are considered. In addition, we consider using our approach to also account for the problem of prior misspecification in functional regression problems.
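To illustrate the general idea of combining several prior beliefs in Gaussian Process regression, the following is a minimal sketch (not the paper's actual construction, which also includes a Sparse Variational variant): two exact GP regressors, each encoding a different prior via its kernel hyperparameters, are combined into a mixture whose weights come from the respective log marginal likelihoods. All kernel choices, hyperparameters and data below are hypothetical.

    # Minimal illustrative sketch: evidence-weighted mixture of two exact GPs
    # with different prior kernels. Hyperparameters and data are hypothetical.
    import numpy as np

    def rbf_kernel(X1, X2, lengthscale, variance):
        """Squared-exponential covariance between two sets of 1-D inputs."""
        d = X1[:, None] - X2[None, :]
        return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

    def gp_fit_predict(X, y, X_star, lengthscale, variance, noise=1e-2):
        """Exact GP regression: predictive mean and log marginal likelihood."""
        K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
        K_star = rbf_kernel(X_star, X, lengthscale, variance)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
        mean = K_star @ alpha
        log_ml = (-0.5 * y @ alpha
                  - np.sum(np.log(np.diag(L)))
                  - 0.5 * len(X) * np.log(2 * np.pi))
        return mean, log_ml

    # Toy regression data
    rng = np.random.default_rng(0)
    X = np.linspace(0, 1, 20)
    y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(20)
    X_star = np.linspace(0, 1, 100)

    # Two competing prior beliefs, encoded as different kernel hyperparameters
    priors = [dict(lengthscale=0.1, variance=1.0),
              dict(lengthscale=0.5, variance=1.0)]

    means, log_mls = zip(*(gp_fit_predict(X, y, X_star, **p) for p in priors))

    # Mixture weights proportional to each prior's marginal likelihood
    weights = np.exp(np.array(log_mls) - np.max(log_mls))
    weights /= weights.sum()

    # Mixture predictive mean: weighted combination of the per-prior means
    mixture_mean = sum(w * m for w, m in zip(weights, means))

The sketch uses plain Bayesian model averaging over the two priors; the paper's mixture formulation generalizes this setting to functional regression and to a variational treatment for larger datasets.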
