A Similarity Measure of Gaussian Process Predictive Distributions

by Lucia Asencio-Martín, et al.

Some scenarios require computing the predictive distribution of a new value of an objective function conditioned on previous observations. We are interested in using a model that makes valid assumptions, such as smoothness or stationarity, about the objective function whose values we are trying to predict. Gaussian processes (GPs) are probabilistic models that can be interpreted as flexible distributions over functions. They encode these assumptions through covariance functions and, once fitted to past observations, yield a predictive distribution for new data. We may face the case where several GPs are used to model different objective functions. GPs are non-parametric models whose complexity is cubic in the number of observations. A measure of how similar one GP predictive distribution is to another would therefore be useful for deciding when to stop using one of two GPs that model functions over the same input space. In effect, we infer that the two objective functions are correlated, so a single GP suffices to model both: in the case of inverse correlation, the prediction for one function is obtained by transforming the prediction for the other. We show empirical evidence, on a set of synthetic and benchmark experiments, that GP predictive distributions can be compared and that one of them is enough to predict two correlated functions in the same input space. This similarity metric could be extremely useful for discarding objectives in Bayesian many-objective optimization.
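The abstract does not specify the similarity measure used in the paper, but the idea can be illustrated with a minimal NumPy sketch: fit two exact GPs with an RBF kernel to two objectives on the same inputs, and compare their predictive distributions on a test grid via a symmetrized KL divergence between the per-point predictive Gaussians. All function names, the kernel hyperparameters, and the choice of KL as the comparison are assumptions for illustration, not the paper's method.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between 1-D input arrays.
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale**2)

def gp_predict(Xtr, ytr, Xte, noise=1e-3):
    # Standard exact-GP predictive mean and std (cubic in len(Xtr)).
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xtr, Xte)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xte, Xte)) - np.sum(v**2, axis=0) + noise
    return mu, np.sqrt(var)

def sym_kl(m1, s1, m2, s2):
    # Symmetrized KL between univariate Gaussians, averaged over the grid.
    kl = lambda ma, sa, mb, sb: (np.log(sb / sa)
                                 + (sa**2 + (ma - mb)**2) / (2 * sb**2) - 0.5)
    return np.mean(kl(m1, s1, m2, s2) + kl(m2, s2, m1, s1))

# Two perfectly inversely correlated objectives: f2 = -f1.
rng = np.random.default_rng(0)
Xtr = rng.uniform(0, 5, 15)
Xte = np.linspace(0, 5, 50)
m1, s1 = gp_predict(Xtr, np.sin(Xtr), Xte)
m2, s2 = gp_predict(Xtr, -np.sin(Xtr), Xte)

raw = sym_kl(m1, s1, m2, s2)        # large: distributions differ as-is
negated = sym_kl(m1, s1, -m2, s2)   # ~0: negating one prediction matches the other
print(raw, negated)
```

Because the GP predictive mean is linear in the observed targets, the GP fitted to `-f1` has exactly the negated mean and the same predictive variance, so the similarity after the sign transformation is essentially zero; one GP plus a sign flip predicts both objectives, which is the reuse the abstract describes.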




