Parallel Markov Chain Monte Carlo for Bayesian Hierarchical Models with Big Data, in Two Stages
Due to the rapid growth of big data sets in recent years, new parallel computing methods have been developed for large-scale Bayesian analysis. These methods partition large data sets by observations into subsets, perform independent Bayesian Markov chain Monte Carlo (MCMC) analysis on each subset, and combine the subset posteriors to estimate the full data posterior. However, in Bayesian nested hierarchical models, typically only a few parameters are common to the full data set, with most parameters being group-specific. Thus, parallel MCMC methods that account for the model structure and split the full data set by groups, rather than by observations, are a more natural approach for analysis. Here, we adapt and extend a recently introduced two-stage Bayesian hierarchical modeling approach, partitioning complete data sets by groups. In stage 1, the group-specific parameters are estimated in parallel, independently of the other groups. The stage 1 posteriors are then used as proposal distributions in stage 2, where the target distribution is the full model. Using three-level and four-level models, we show in simulation studies and a real data analysis that our method closely approximates the full data analysis, with greatly reduced computation times; we also detail the advantages of our method over existing parallel MCMC computing methods.
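To illustrate the two-stage idea on a deliberately simple case, here is a hedged sketch in Python/NumPy for a toy two-level Gaussian model: observations y_gi ~ N(theta_g, 1) within groups, with theta_g ~ N(mu, tau^2). The model, the known values of sigma and tau, and all sample sizes are illustrative assumptions, not the paper's actual setup. Stage 1 draws each group's posterior under a flat prior (independently, as it would be done in parallel); stage 2 runs Metropolis-within-Gibbs on the full hierarchical model, using the stage 1 samples as independence proposals. Because the stage 1 proposal is proportional to the group likelihood, the likelihood cancels in the acceptance ratio, leaving only the ratio of group-level priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy data (hypothetical): G groups, y_gi ~ N(theta_g, 1) ----------------
# sigma = 1 and tau = 0.5 are assumed known here purely for brevity.
G, n, mu_true, tau = 8, 50, 2.0, 0.5
theta_true = rng.normal(mu_true, tau, G)
y = rng.normal(theta_true[:, None], 1.0, (G, n))

# --- Stage 1: independent per-group posteriors under a flat prior -----------
# With sigma = 1 and a flat prior, theta_g | y_g ~ N(ybar_g, 1/n) in closed
# form; in the actual method these would be MCMC samples run in parallel.
S = 4000
ybar = y.mean(axis=1)
stage1 = rng.normal(ybar[:, None], 1.0 / np.sqrt(n), (G, S))

# --- Stage 2: MH-within-Gibbs targeting the full hierarchical model ---------
theta = stage1[:, 0].copy()
mu = theta.mean()
mu_draws = np.empty(S)
for s in range(S):
    # Independence proposal: one recycled stage 1 draw per group.
    prop = stage1[np.arange(G), rng.integers(S, size=G)]
    # Likelihood cancels against the proposal; only the N(mu, tau^2)
    # group-level prior ratio remains in the log acceptance ratio.
    logr = -0.5 * ((prop - mu) ** 2 - (theta - mu) ** 2) / tau**2
    accept = np.log(rng.uniform(size=G)) < logr
    theta = np.where(accept, prop, theta)
    # Gibbs update for the shared mean (flat prior on mu).
    mu = rng.normal(theta.mean(), tau / np.sqrt(G))
    mu_draws[s] = mu

mu_hat = mu_draws[1000:].mean()  # posterior mean of mu after burn-in
print(mu_hat)
```

The per-group acceptance steps are independent given mu, which is why they can be vectorized; the same cancellation is what makes reusing stage 1 output as a stage 2 proposal cheap, since no group-level likelihood needs to be re-evaluated.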