Posterior Averaging Information Criterion

09/19/2020
by Shouhao Zhou, et al.

We propose a new model selection method, the posterior averaging information criterion, for Bayesian model assessment from a predictive perspective. The theoretical foundation is built on the Kullback-Leibler divergence to quantify the similarity between the proposed candidate model and the underlying true model. From a Bayesian perspective, our method evaluates the candidate models over the entire posterior distribution in terms of predicting a future independent observation. Without assuming that the true distribution is contained in the candidate models, the new criterion is developed by correcting the asymptotic bias of the posterior mean of the log-likelihood against its expected log-likelihood. It can be applied generally, even to Bayesian models with degenerate non-informative priors. Simulations in both normal and binomial settings demonstrate decent small-sample performance.
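The central quantity the criterion builds on is the posterior mean of the log-likelihood, estimated by averaging the log-likelihood over posterior draws. The sketch below illustrates that computation for a simple normal model with a flat (non-informative) prior; the penalty term shown is purely a placeholder (the number of parameters), not the paper's actual asymptotic bias correction, and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from N(1, 1)
y = rng.normal(1.0, 1.0, size=50)
n = len(y)

# Candidate model: y_i ~ N(mu, 1) with a flat prior on mu.
# The posterior is then mu | y ~ N(ybar, 1/n).
ybar = y.mean()
mu_draws = rng.normal(ybar, 1.0 / np.sqrt(n), size=4000)

def loglik(mu, y):
    # Log-likelihood of the data under N(mu, 1)
    return -0.5 * len(y) * np.log(2 * np.pi) - 0.5 * np.sum((y - mu) ** 2)

# Monte Carlo estimate of the posterior mean of the log-likelihood
post_mean_ll = np.mean([loglik(m, y) for m in mu_draws])

# Placeholder bias correction: subtract p = 1 (one free parameter).
# The actual correction in the paper is derived asymptotically and differs.
p = 1
score = post_mean_ll - p
```

Because the log-likelihood is maximized at the posterior mean here, averaging it over posterior draws yields a value below the maximized log-likelihood; the bias correction then accounts for the optimism of evaluating a model on the same data used to fit it.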
