Gaussian Process-Gated Hierarchical Mixtures of Experts

02/09/2023
by Yuhao Liu, et al.

In this paper, we propose novel Gaussian process-gated hierarchical mixtures of experts (GPHMEs), in which both the gates and the experts are built with Gaussian processes. Unlike other mixtures of experts, whose gating models are linear in the input, the gating functions of our model are inner nodes built with Gaussian processes based on random features, and are therefore nonlinear and nonparametric. The experts are likewise built with Gaussian processes and provide predictions that depend on the test data. The optimization of the GPHMEs is carried out by variational inference. The proposed GPHMEs have several advantages. First, they outperform tree-based HME benchmarks that partition the data in the input space. Second, they achieve good performance with reduced complexity. Third, they provide interpretability of deep Gaussian processes and, more generally, of deep Bayesian neural networks. Our GPHMEs demonstrate excellent performance on large-scale data sets even with models of quite modest size.
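To make the gating idea concrete, the following is a minimal sketch (not the authors' implementation) of a depth-one GP-gated mixture: inputs are mapped through random Fourier features, which approximate an RBF-kernel Gaussian process, and a sigmoid over those features softly routes each point between two random-feature experts. All names and parameter values here are hypothetical, and the variational-inference training described in the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, n_features=50, lengthscale=1.0):
    """Random Fourier features approximating an RBF kernel (Rahimi-Recht style)."""
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy data: 8 points in 3 dimensions.
X = rng.normal(size=(8, 3))
Phi = random_features(X)                        # shared feature map, shape (8, 50)

# Hypothetical weights; in the paper these would be inferred variationally.
theta_gate = rng.normal(size=Phi.shape[1])      # gate weights in feature space
w_left = rng.normal(size=Phi.shape[1])          # left expert weights
w_right = rng.normal(size=Phi.shape[1])         # right expert weights

# Nonlinear, nonparametric soft gate in (0, 1): sigmoid of a GP sample path.
g = 1.0 / (1.0 + np.exp(-(Phi @ theta_gate)))

# Mixture prediction: gate-weighted combination of the two experts.
y_hat = g * (Phi @ w_left) + (1.0 - g) * (Phi @ w_right)
```

Because the gate is a function of random features rather than a linear function of the raw input, the induced partition of the input space is nonlinear, which is the key contrast with classical HMEs drawn in the abstract. A deeper tree would stack such gates at every inner node.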


Related research

- Mixtures of Gaussian Process Experts with SMC^2 (08/26/2022)
- Neural Likelihoods for Multi-Output Gaussian Processes (05/31/2019)
- Mixtures of Gaussian process experts based on kernel stick-breaking processes (04/26/2023)
- Scalable Nonparametric Bayesian Inference on Point Processes with Gaussian Processes (10/24/2014)
- Biased Mixtures of Experts: Enabling Computer Vision Inference Under Data Transfer Limitations (08/21/2020)
- Enriched Mixtures of Gaussian Process Experts (05/30/2019)
- Aggregating Dependent Gaussian Experts in Local Approximation (10/17/2020)
