Objective priors for divergence-based robust estimation

04/29/2020
by Tomoyuki Nakagawa, et al.

Objective priors for outlier-robust Bayesian estimation based on divergences are considered. It is known that the γ-divergence (or type 0 divergence) has attractive properties for robust parameter estimation (Jones et al. (2001), Fujisawa and Eguchi (2008)). This paper focuses on the reference and moment matching priors under the quasi-posterior distribution based on the γ-divergence. In general, since such objective priors depend on the unknown data-generating mechanism, we cannot use them directly in the presence of outliers. Under Huber's ε-contamination model, we show that the proposed priors are approximately robust under a condition on the tail of the contamination distribution, without assuming any conditions on the contamination ratio. Simulation studies are also presented.
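To give a concrete feel for why the γ-divergence yields outlier-robust estimates, the following is a minimal sketch (not the paper's Bayesian procedure) of frequentist γ-estimation of a normal mean in the style of Fujisawa and Eguchi (2008). The empirical γ-cross-entropy for a model density f_θ is -(1/γ) log((1/n) Σᵢ f_θ(xᵢ)^γ) + (1/(1+γ)) log ∫ f_θ(x)^{1+γ} dx; the power weighting f_θ(xᵢ)^γ automatically downweights observations in the tail of the model. The choices γ = 0.5, σ = 1, and the 5% contamination setup below are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def gamma_loss(mu, x, gamma=0.5, sigma=1.0):
    """Empirical gamma-cross-entropy for a N(mu, sigma^2) model.

    First term: -(1/gamma) * log of the sample mean of f(x_i)^gamma.
    Second term: (1/(1+gamma)) * log of the analytic integral of
    f^(1+gamma) for a normal density, (2*pi*sigma^2)^(-gamma/2) / sqrt(1+gamma);
    it is constant in mu but kept for completeness.
    """
    logf = -0.5 * ((x - mu) / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma**2)
    term1 = -np.log(np.mean(np.exp(gamma * logf))) / gamma
    term2 = (-0.5 * gamma * np.log(2 * np.pi * sigma**2)
             - 0.5 * np.log(1.0 + gamma)) / (1.0 + gamma)
    return term1 + term2


rng = np.random.default_rng(0)
# 95% inliers from N(0, 1), 5% outliers from N(10, 1) -- an
# epsilon-contamination setup in the spirit of Huber's model.
x = np.concatenate([rng.normal(0.0, 1.0, 95),
                    rng.normal(10.0, 1.0, 5)])

# The gamma-estimator stays near the inlier mean 0, while the
# ordinary sample mean is pulled toward the outliers.
mu_hat = minimize_scalar(gamma_loss, bounds=(-5.0, 5.0), args=(x,),
                         method="bounded").x
print(mu_hat, x.mean())
```

Minimizing the first term maximizes the average of f_θ(xᵢ)^γ, so points far from μ contribute almost nothing, which is the source of the robustness; as γ → 0 the loss reduces to the ordinary negative log-likelihood and the robustness disappears.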
