Statistical Inference in Mean-Field Variational Bayes
We conduct a non-asymptotic analysis of mean-field variational inference for approximating posterior distributions in complex Bayesian models that may involve latent variables. We show that the mean-field approximation to the posterior is close, in Kullback-Leibler (KL) divergence, to a normal distribution centered at the maximum likelihood estimator (MLE). In particular, our results imply that the center of the mean-field approximation matches the MLE up to higher-order terms, so there is essentially no loss of efficiency in using it as a point estimator for the parameter in any regular parametric model with latent variables. We also propose a new class of variational weighted likelihood bootstrap (VWLB) methods for quantifying the uncertainty in mean-field variational inference. The proposed VWLB can be viewed as a new sampling scheme that produces independent samples for approximating the posterior. Compared with traditional sampling algorithms such as Markov chain Monte Carlo, VWLB can be implemented in parallel and requires no tuning.
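Schematically, and in our own notation rather than the paper's (the direction of the divergence and the exact form of the covariance are placeholders for the precise non-asymptotic statement), the first result says that the mean-field approximation $\widehat q$ based on $n$ observations satisfies

\[
\mathrm{KL}\!\left(\widehat q \,\middle\|\, \mathcal N\!\big(\widehat\theta_{\mathrm{MLE}},\, \Sigma_n\big)\right) \longrightarrow 0, \qquad \|\Sigma_n\| = O(1/n),
\]

so that draws from $\widehat q$ concentrate around the MLE at the parametric rate.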
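To make the sampling scheme concrete, below is a minimal sketch of the weighted-likelihood-bootstrap idea underlying VWLB on a toy Gaussian model. The model, the random-weight distribution, and the closed-form inner step are our illustrative assumptions, not the paper's algorithm: the actual VWLB pairs the random weighting with a mean-field variational optimization, whereas here the inner step is an exact weighted MLE purely so the example stays self-contained.

```python
# Hypothetical sketch of a weighted-likelihood-bootstrap sampler in the spirit
# of VWLB, on a toy Gaussian model where each weighted optimization has a
# closed form. (Assumption: in the paper, this inner step would instead be a
# mean-field variational fit to the weighted likelihood.)
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)   # observed data (toy)
n, B = x.size, 1000                            # sample size, bootstrap draws

def weighted_mle(x, w):
    """Maximizer of the w-weighted Gaussian log-likelihood (closed form)."""
    mu = np.average(x, weights=w)
    sigma2 = np.average((x - mu) ** 2, weights=w)
    return mu, np.sqrt(sigma2)

# Each draw uses fresh i.i.d. Exp(1) weights (equivalent, after normalization,
# to flat Dirichlet weights). Draws are mutually independent, so the loop is
# embarrassingly parallel -- no Markov chain, no burn-in, no step-size tuning.
draws = np.array([weighted_mle(x, rng.exponential(1.0, size=n))
                  for _ in range(B)])

mu_draws = draws[:, 0]
print("bootstrap approx for mu: mean %.3f, sd %.3f"
      % (mu_draws.mean(), mu_draws.std()))
print("flat-prior reference:    mean %.3f, sd %.3f"
      % (x.mean(), x.std() / np.sqrt(n)))
```

On this toy problem the spread of the bootstrap draws for the mean should roughly match the flat-prior posterior standard deviation $\sigma/\sqrt{n}$, which the final two lines compare; the independence of the draws is what permits the parallel implementation claimed in the abstract.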