Convolutional Normalizing Flows for Deep Gaussian Processes
Deep Gaussian processes (DGPs), hierarchical compositions of GP models, have successfully boosted expressive power beyond that of their single-layer counterparts. However, exact inference in DGPs is intractable, which has motivated the recent development of variational inference based methods. Unfortunately, these methods either yield a biased posterior belief or make convergence difficult to assess. This paper, in contrast, introduces a new approach for specifying flexible, arbitrarily complex, and scalable approximate posterior distributions. The posterior distribution is constructed through a normalizing flow (NF), which transforms a simple initial probability distribution into a more complex one via a sequence of invertible transformations. Moreover, a novel convolutional normalizing flow (CNF) is developed to improve time efficiency and capture dependencies between layers. Empirical evaluation demonstrates that the CNF DGP outperforms state-of-the-art approximation methods for DGPs.
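To make the flow mechanism concrete, the sketch below shows the change-of-variables update that any normalizing flow performs: each invertible step transforms the samples and subtracts the log-determinant of its Jacobian from the running log-density. It uses a generic planar flow rather than the paper's convolutional flow, so all names and parameter choices (`planar_flow`, `u`, `w`, `b`, the number of steps) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One invertible planar-flow step f(z) = z + u * tanh(w.z + b).

    Returns the transformed samples and the log|det Jacobian| term
    needed by the change-of-variables formula:
        log q(f(z)) = log q(z) - log |det df/dz|.
    Invertibility requires u.w >= -1; not enforced in this sketch.
    """
    a = np.tanh(z @ w + b)                  # scalar activation per sample
    f_z = z + np.outer(a, u)                # transformed samples
    psi = (1.0 - a ** 2)[:, None] * w       # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + psi @ u))
    return f_z, log_det

# Push standard-normal samples through a short sequence of flow steps,
# accumulating the log-density correction at each step.
rng = np.random.default_rng(0)
d = 2
z = rng.standard_normal((1000, d))          # samples from the simple base q0
log_q = -0.5 * (z ** 2).sum(axis=1) - 0.5 * d * np.log(2 * np.pi)

for _ in range(4):                          # K = 4 flow steps (illustrative)
    u = 0.5 * rng.standard_normal(d)
    w = 0.5 * rng.standard_normal(d)
    b = rng.standard_normal()
    z, log_det = planar_flow(z, u, w, b)
    log_q -= log_det                        # change-of-variables update

print(z.shape, log_q.shape)                 # (1000, 2) (1000,)
```

In a variational setting such as the one the abstract describes, the flow parameters would be trained to maximize an evidence lower bound, with `log_q` supplying the approximate posterior's entropy term; the paper's CNF additionally structures these transformations convolutionally to share parameters and couple the DGP layers.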