Greater Than the Sum of its Parts: Computationally Flexible Bayesian Hierarchical Modeling

10/23/2020
by Devin S. Johnson, et al.

We propose a multistage method for making inference at all levels of a Bayesian hierarchical model (BHM) that uses natural data partitions to increase efficiency by allowing computations to run in parallel, with the software that is most appropriate for each data partition. In the first stage, each data partition is analyzed separately; the data component of the full hierarchical model is then approximated by the product of independent normal distributions. In the second stage, the Bayesian maximum a posteriori (MAP) estimator is found by maximizing the approximated posterior density with respect to the parameters. If the parameters of the model can be represented as normally distributed random effects, the second-stage optimization is equivalent to fitting a multivariate normal linear mixed model. The method can be extended to account for fixed parameters shared across data partitions, as well as parameters that are distinct to individual partitions. For the distinct parameters, we consider a third stage that re-estimates them for each data partition based on the results of the second stage, allowing information from the entire data set to properly inform their posterior distributions. The method is demonstrated with two ecological data sets and models: a random effects GLM and an integrated population model (IPM). The multistage results are compared to estimates from the same models fit in a single stage to the entire data set. In both examples, the multistage point estimates and posterior standard deviations closely approximate those obtained by fitting the models to all of the data simultaneously, so the approach can be considered for fitting hierarchical Bayesian models when doing so in one step is computationally prohibitive.
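As a rough illustration of the first two stages, the sketch below is not the authors' code: it assumes a toy one-parameter Gaussian model, a flat-prior first-stage analysis, and hypothetical variable names. It analyzes each data partition separately, summarizes each partition-level result with a normal approximation, and then maximizes the product of those approximations and a prior to obtain a stage-two MAP estimate.

```python
# Minimal two-stage sketch (illustrative only; not the paper's implementation).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

# Toy data: K partitions, each informing the same scalar parameter theta.
K = 4
partitions = [rng.normal(loc=2.0, scale=1.0, size=50) for _ in range(K)]

# Stage 1: analyze each partition separately (here a flat-prior normal model)
# and summarize the partition-level posterior for theta as Normal(mean, var).
stage1 = []
for y in partitions:
    mean = y.mean()
    var = y.var(ddof=1) / len(y)
    stage1.append((mean, var))

# Stage 2: approximate the data component of the full posterior by the product
# of the stage-1 normal approximations, multiply by a prior, and maximize the
# resulting density to obtain the MAP estimate.
prior_mean, prior_var = 0.0, 10.0**2

def neg_log_posterior(theta):
    lp = multivariate_normal.logpdf(theta, mean=prior_mean, cov=prior_var)
    for m, v in stage1:
        lp += multivariate_normal.logpdf(theta, mean=m, cov=v)
    return -lp

result = minimize(neg_log_posterior, x0=np.array([0.0]))
print("Stage-2 MAP estimate of theta:", result.x)
```

The paper's second stage handles vector-valued random effects and shared fixed parameters; this toy reduces everything to a single scalar simply to keep the mechanics of "fit partitions independently, then combine normal approximations" visible.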
