The Inverse Gamma-Gamma Prior for Optimal Posterior Contraction and Multiple Hypothesis Testing

10/12/2017
by Ray Bai, et al.

We study the well-known problem of estimating a sparse n-dimensional mean vector θ = (θ_1, ..., θ_n) whose entries are corrupted by Gaussian white noise. In the Bayesian framework, continuous shrinkage priors that can be expressed as scale mixtures of normal densities are popular for obtaining sparse estimates of θ. In this article, we introduce a new fully Bayesian scale-mixture prior, the inverse gamma-gamma (IGG) prior. We show that the posterior distribution contracts around the true θ at the (near-)minimax rate under very mild conditions. To classify true signals (θ_i ≠ 0), we also propose a hypothesis test based on thresholding the posterior mean. In the context of multiple hypothesis testing, Bogdan et al. (2011) introduced the notion of asymptotic Bayes optimality under sparsity (ABOS). Taking the loss function to be the expected number of misclassified tests, our test procedure asymptotically attains the ABOS risk exactly. The IGG prior appears to be the first fully Bayesian continuous shrinkage prior that both attains the minimax posterior contraction rate and induces a multiple testing rule that is asymptotically Bayes optimal under sparsity. In addition to these theoretical guarantees, simulations show that the IGG outperforms other continuous shrinkage priors in both estimation and classification.
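To make the setup concrete, the following sketch simulates the sparse normal-means model with an IGG-style scale-mixture prior: each θ_i is normal with variance given by the product of an inverse-gamma and a gamma local scale. The hyperparameters a and b here are hypothetical placeholders, not the paper's recommended values, and the exact parameterization of the IGG prior is an assumption based on the abstract's description of it as a scale-mixture normal density.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
a, b = 0.5, 0.5  # hypothetical hyperparameters (not from the paper)

# IGG-style scale mixture (assumed form):
#   theta_i | lam_i, xi_i ~ N(0, lam_i * xi_i)
#   lam_i ~ InverseGamma(a, 1),  xi_i ~ Gamma(b, 1)
lam = 1.0 / rng.gamma(shape=a, scale=1.0, size=n)  # inverse-gamma draws
xi = rng.gamma(shape=b, scale=1.0, size=n)
theta = rng.normal(0.0, np.sqrt(lam * xi))

# Observations corrupted by Gaussian white noise: y_i = theta_i + eps_i
y = theta + rng.normal(size=n)
```

The heavy-tailed inverse-gamma component lets large signals escape shrinkage, while the gamma component concentrates mass near zero, which is the qualitative behavior a sparsity-inducing scale mixture needs.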
