Posterior Robustness with Milder Conditions: Contamination Models Revisited
Robust Bayesian linear regression is a classical but essential statistical tool. Although novel robustness properties of posterior distributions have recently been proved under a certain class of error distributions, the sufficient conditions are restrictive and exclude several important situations. In this work, we revisit a classical two-component mixture model for the response variables, also known as a contamination model, in which one component is a light-tailed regression model and the other is heavy-tailed. The latter component is independent of the regression parameters, which is crucial for proving posterior robustness. We obtain new sufficient conditions for posterior (non-)robustness and use them to reveal non-trivial robustness results. In particular, we find that even the Student-t error distribution can achieve posterior robustness in our framework. A numerical study examines the Kullback-Leibler divergence between the posterior distribution based on the full data and the one based on the data with outliers removed.
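To make the model class concrete, the following is a minimal sketch of a two-component contamination model of the kind described above; the Gaussian light-tailed component, the mixing weight, and the generic heavy-tailed density h are illustrative assumptions and not necessarily the paper's exact specification.

% Illustrative contamination model (assumed form):
% each response comes from a light-tailed regression component with
% probability 1 - \varepsilon, and from a heavy-tailed component h,
% free of the regression parameters, with probability \varepsilon.
\[
  p\bigl(y_i \mid x_i, \beta, \sigma^2\bigr)
  = (1 - \varepsilon)\,\phi\!\bigl(y_i;\, x_i^{\top}\beta,\ \sigma^2\bigr)
  + \varepsilon\, h(y_i),
  \qquad i = 1, \dots, n,
\]
where $\phi(\cdot;\mu,\sigma^2)$ denotes the normal density and $h$ is a heavy-tailed density that does not depend on $\beta$ or $\sigma^2$; this independence is what allows outlying observations to be absorbed by the second component without distorting inference on $\beta$.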
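The numerical check mentioned in the abstract compares the posterior under the full data with the posterior under the data with outliers removed. The sketch below illustrates that comparison in a deliberately simplified setting: a conjugate Gaussian regression with a fixed prior variance, so both posteriors and their Kullback-Leibler divergence are available in closed form. The model, function names, and parameter values (sigma2, tau2, the number of contaminated responses) are assumptions for illustration, not the paper's contamination model or its actual experiment.

```python
import numpy as np

def gaussian_posterior(X, y, sigma2=1.0, tau2=10.0):
    """Closed-form posterior N(mu, Sigma) for beta under a N(0, tau2*I) prior
    and Gaussian errors with known variance sigma2 (conjugate case)."""
    p = X.shape[1]
    Sigma_inv = X.T @ X / sigma2 + np.eye(p) / tau2
    Sigma = np.linalg.inv(Sigma_inv)
    mu = Sigma @ (X.T @ y) / sigma2
    return mu, Sigma

def kl_gaussian(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) between multivariate Gaussians."""
    p = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - p
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

rng = np.random.default_rng(0)
n, p = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)
y[:3] += 20.0                      # contaminate a few responses with outliers

keep = np.ones(n, dtype=bool)
keep[:3] = False
mu_full, S_full = gaussian_posterior(X, y)
mu_clean, S_clean = gaussian_posterior(X[keep], y[keep])

# A large divergence indicates the posterior is strongly affected by the
# outliers; a robust model would keep this divergence small.
print("KL(full || outlier-removed):",
      kl_gaussian(mu_full, S_full, mu_clean, S_clean))
```

Under a posterior-robust error model, the analogous divergence would remain bounded as the outliers are moved arbitrarily far from the bulk of the data; the non-robust Gaussian model above is included only to show how the diagnostic is computed.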