Unbiased Risk Estimation in the Normal Means Problem via Coupled Bootstrap Techniques

11/17/2021
by Natalia L. Oliveira, et al.

We study a new method for estimating the risk of an arbitrary estimator of the mean vector in the classical normal means problem. The key idea is to generate two auxiliary data vectors, by adding two carefully constructed normal noise vectors to the original data vector. We then train the estimator of interest on the first auxiliary data vector and test it on the second. In order to stabilize the estimate of risk, we average this procedure over multiple draws of the synthetic noise. A key aspect of this coupled bootstrap approach is that it delivers an unbiased estimate of risk under no assumptions on the estimator of the mean vector, albeit for a slightly "harder" version of the original normal means problem, where the noise variance is inflated. We show that, under the assumptions required for Stein's unbiased risk estimator (SURE), a limiting version of the coupled bootstrap estimator recovers SURE exactly (with an infinitesimal auxiliary noise variance and infinite bootstrap samples). We also analyze a bias-variance decomposition of the error of our risk estimator, to elucidate the effects of the variance of the auxiliary noise and the number of bootstrap samples on the accuracy of the risk estimator. Lastly, we demonstrate that our coupled bootstrap risk estimator performs quite favorably in simulated experiments and in an image denoising example.
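To make the procedure concrete, here is a minimal sketch of the coupled bootstrap idea described above, written from the abstract alone. The function name `coupled_bootstrap_risk`, the parameter names `alpha` (auxiliary noise scale) and `B` (number of bootstrap draws), and the exact debiasing constant are assumptions for illustration; the paper's own estimator may target prediction error rather than estimation risk and use a different constant. The sketch targets the mean-squared estimation risk of `g` in the inflated-noise problem, where the training data have variance (1 + alpha) * sigma^2.

```python
import numpy as np

def coupled_bootstrap_risk(y, g, sigma, alpha=0.1, B=100, seed=None):
    """Coupled-bootstrap risk estimate for an estimator g of the mean vector.

    Sketch based on the abstract: y ~ N(theta, sigma^2 I). For each bootstrap
    draw, add and subtract scaled normal noise to form an independent
    train/test pair, evaluate g on the train vector against the test vector,
    and debias. The average over draws is unbiased for the risk of g in the
    "harder" problem whose noise variance is (1 + alpha) * sigma^2.
    """
    rng = np.random.default_rng(seed)
    n = y.shape[0]
    ests = np.empty(B)
    for b in range(B):
        omega = rng.normal(scale=sigma, size=n)        # auxiliary noise draw
        y_train = y + np.sqrt(alpha) * omega           # variance (1 + alpha) * sigma^2
        y_test = y - omega / np.sqrt(alpha)            # independent of y_train
        # ||y_test - g(y_train)||^2 exceeds the inflated-problem risk by
        # n * (1 + 1/alpha) * sigma^2 in expectation, so subtract that constant.
        resid = y_test - g(y_train)
        ests[b] = np.sum(resid ** 2) - n * (1.0 + 1.0 / alpha) * sigma ** 2
    return ests.mean()

# Example usage with a hypothetical estimator: soft-thresholding at level 1.
if __name__ == "__main__":
    g = lambda x: np.sign(x) * np.maximum(np.abs(x) - 1.0, 0.0)
    y = np.random.default_rng(0).normal(loc=2.0, scale=1.0, size=500)
    print(coupled_bootstrap_risk(y, g, sigma=1.0, alpha=0.1, B=200, seed=1))
```

As the abstract notes, smaller `alpha` brings the surrogate problem closer to the original one, and larger `B` reduces the Monte Carlo variance of the average over noise draws.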

