Prior Intensified Information Criterion

10/23/2021
by Yoshiyuki Ninomiya, et al.

The widely applicable information criterion (WAIC) has been used as a model selection criterion for Bayesian statistics in recent years. It is an asymptotically unbiased estimator of the Kullback-Leibler divergence between a Bayesian predictive distribution and the true distribution. Not only is the WAIC theoretically more sound than other information criteria, but its usefulness in practice has also been reported. On the other hand, the WAIC is intended for settings in which the prior distribution does not have an asymptotic influence, and when the class of prior distributions is made more complex, it invariably selects the most complex one. To alleviate these concerns, this paper proposes the prior intensified information criterion (PIIC). In addition, it customizes this criterion to incorporate sparse estimation and causal inference. Numerical experiments show that the PIIC clearly outperforms the WAIC in terms of prediction performance when the above concerns arise. A real data analysis confirms that the variable selection results and Bayesian estimators of the WAIC and the PIIC differ significantly.
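For context, Watanabe's standard definition of the WAIC (not stated on this page, and given here only as a reminder of the quantity being modified) is, for data x_1, ..., x_n and posterior \pi(\theta \mid x_1, ..., x_n),

\mathrm{WAIC} = -\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_{\pi}\!\left[ p(x_i \mid \theta) \right] + \frac{1}{n}\sum_{i=1}^{n} \mathbb{V}_{\pi}\!\left[ \log p(x_i \mid \theta) \right],

i.e., the Bayesian training loss plus the functional variance, where the expectation and variance are taken over the posterior. The PIIC proposed in the paper adjusts this construction for priors whose influence does not vanish asymptotically.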

