Robust Bayesian Regression with Synthetic Posterior
Regression models are fundamental tools in statistics, but their standard estimation is sensitive to outliers. While several robust methods have been proposed from a frequentist perspective, Bayesian methods are attractive because they allow straightforward uncertainty quantification of the estimation results. In this article, we propose a robust Bayesian method for regression models by introducing a synthetic posterior distribution based on a robust divergence. We also introduce shrinkage priors on the regression coefficients for variable selection, and provide an efficient posterior computation algorithm using the Bayesian bootstrap within Gibbs sampling. We carry out simulation studies to evaluate the performance of the proposed method, and apply it to real datasets.
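A minimal sketch of the general idea of a divergence-based synthetic posterior, under assumptions not fixed by the abstract: here the robust divergence is taken to be the density power divergence for a normal linear regression model, the tuning constant `alpha`, the priors, and the random-walk Metropolis sampler are all illustrative choices, and this is not the authors' Bayesian-bootstrap-within-Gibbs algorithm or their shrinkage-prior setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def dpd_log_synthetic_lik(beta, log_sigma, y, X, alpha=0.3):
    """Negative density-power-divergence loss for normal regression,
    used as a synthetic log-likelihood (assumed form, for illustration)."""
    sigma = np.exp(log_sigma)
    resid = y - X @ beta
    dens = np.exp(-0.5 * (resid / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    # data-fit term: (1/alpha) * sum_i f_theta(y_i)^alpha
    fit = np.sum(dens ** alpha) / alpha
    # model term: integral of f_theta^(1+alpha), available in closed form for the normal
    integral = (2 * np.pi * sigma ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return fit - len(y) * integral / (1 + alpha)

def log_synthetic_post(beta, log_sigma, y, X, alpha=0.3):
    # weak normal priors on beta and log(sigma); purely an illustrative assumption
    log_prior = -0.5 * np.sum(beta ** 2) / 100.0 - 0.5 * log_sigma ** 2 / 100.0
    return dpd_log_synthetic_lik(beta, log_sigma, y, X, alpha) + log_prior

def rw_metropolis(y, X, n_iter=5000, step=0.05, alpha=0.3):
    """Sample the synthetic posterior with a simple random-walk Metropolis chain."""
    p = X.shape[1]
    beta, log_sigma = np.zeros(p), 0.0
    cur = log_synthetic_post(beta, log_sigma, y, X, alpha)
    draws = np.empty((n_iter, p + 1))
    for t in range(n_iter):
        beta_prop = beta + step * rng.standard_normal(p)
        ls_prop = log_sigma + step * rng.standard_normal()
        prop = log_synthetic_post(beta_prop, ls_prop, y, X, alpha)
        if np.log(rng.uniform()) < prop - cur:  # accept/reject
            beta, log_sigma, cur = beta_prop, ls_prop, prop
        draws[t] = np.append(beta, log_sigma)
    return draws

# Toy data with a few gross outliers to show the robustness of the synthetic posterior.
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = X @ np.array([1.0, 2.0]) + 0.5 * rng.standard_normal(n)
y[:5] += 15.0  # contamination
draws = rw_metropolis(y, X)
print(draws[2500:, :p].mean(axis=0))  # posterior means of the regression coefficients
```

Because the divergence downweights observations with very small model density, the contaminated points have limited influence on the posterior for the coefficients; with the ordinary Gaussian likelihood the same outliers would pull the intercept upward.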