Bayesian Inference for k-Monotone Densities with Applications to Multiple Testing

06/08/2023
by Kang Wang, et al.

A shape restriction such as monotonicity or convexity, imposed on a function of interest like a regression or density function, allows the function to be estimated without smoothness assumptions. The concept of k-monotonicity encompasses a family of shape restrictions, including decreasing (k = 1) and convex decreasing (k = 2) as special cases. We consider Bayesian approaches to estimating a k-monotone density. Using a kernel mixture representation and placing a Dirichlet process or a finite mixture prior on the mixing distribution, we show that the posterior contraction rate in the Hellinger distance is (n/log n)^{-k/(2k+1)} for a k-monotone density, which is minimax optimal up to a polylogarithmic factor. When the true k-monotone density is a finite J_0-component mixture of the kernel, the contraction rate improves to the nearly parametric rate √((J_0 log n)/n). Moreover, by placing a prior on k, we show that the same rates hold even when the best value of k is unknown. A specific application to modeling the density of p-values in a large-scale multiple testing problem is considered. Simulation studies are conducted to evaluate the performance of the proposed method.
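
The abstract does not spell out the kernel; by the classical Williamson representation, a k-monotone density on (0, ∞) can be written as a scale mixture of the kernel (k/θ)(1 − x/θ)_+^{k−1}. The sketch below is a minimal illustration of that finite-mixture form, fitting the mixing weights with a plain EM update on a fixed grid of scale parameters rather than the Dirichlet process or finite mixture priors analyzed in the paper; all function names, the grid, and the parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def k_monotone_kernel(x, theta, k):
    """Kernel of the k-monotone mixture representation:
    ker(x; theta) = (k / theta) * (1 - x/theta)_+^{k-1} for 0 < x < theta."""
    x = np.asarray(x, dtype=float)
    z = np.clip(1.0 - x / theta, 0.0, None)
    dens = (k / theta) * z ** (k - 1)
    return np.where(x < theta, dens, 0.0)

def mixture_density(x, weights, thetas, k):
    """Finite mixture f(x) = sum_j w_j * ker(x; theta_j)."""
    parts = np.stack([k_monotone_kernel(x, t, k) for t in thetas])
    return np.asarray(weights) @ parts

def em_weights(x, thetas, k, n_iter=200):
    """EM updates for the mixing weights over a fixed grid of scale
    parameters (a simple sieve stand-in for the nonparametric prior)."""
    K = np.stack([k_monotone_kernel(x, t, k) for t in thetas]).T  # n x J
    w = np.full(K.shape[1], 1.0 / K.shape[1])
    for _ in range(n_iter):
        resp = K * w                              # unnormalized responsibilities
        resp /= resp.sum(axis=1, keepdims=True)   # E-step: posterior component probs
        w = resp.mean(axis=0)                     # M-step: updated mixing weights
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate from a 2-component k-monotone mixture with k = 2 (convex decreasing):
    # draw a component, then x = theta * (1 - U^{1/k}) inverts the kernel's CDF.
    k, true_thetas, true_w = 2, np.array([1.0, 3.0]), np.array([0.6, 0.4])
    comp = rng.choice(2, size=2000, p=true_w)
    x = true_thetas[comp] * (1.0 - rng.uniform(size=2000) ** (1.0 / k))
    grid = np.linspace(0.2, 5.0, 25)      # fixed grid of scale parameters (illustrative)
    w_hat = em_weights(x, grid, k)
    print("estimated density at x = 0.5:", mixture_density(0.5, w_hat, grid, k))
```

This grid-based EM is only a point-estimate stand-in; the paper's Bayesian procedure instead samples the mixing distribution from its posterior, which is what yields the stated contraction rates.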
