Gradient-Based Markov Chain Monte Carlo for Bayesian Inference With Non-Differentiable Priors
The use of non-differentiable priors in Bayesian statistics has become increasingly popular, in particular in Bayesian imaging analysis. Current state-of-the-art methods are approximate in the sense that they replace the posterior with a smooth approximation via Moreau-Yosida envelopes and apply gradient-based discretized diffusions to sample from the resulting distribution. We characterize the error of the Moreau-Yosida approximation and propose a novel implementation using underdamped Langevin dynamics. In mission-critical cases, however, replacing the posterior with an approximation may not be a viable option. Instead, we show that Piecewise-Deterministic Markov Processes (PDMPs) can be used for exact posterior inference from distributions that are differentiable almost everywhere. Furthermore, in contrast with diffusion-based methods, the proposed PDMP-based samplers place no assumptions on the shape of the prior and do not require access to a computationally cheap proximal operator, and consequently have a much broader scope of application. Through detailed numerical examples, including a non-differentiable circular distribution and a non-convex genomics model, we demonstrate the relative strengths of these sampling methods on problems of moderate to high dimension, underlining the benefits of PDMP-based methods when accurate sampling is decisive.
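The abstract summarizes two families of samplers. As a rough illustration of the first (envelope-based) approach, the following is a minimal sketch of a MYULA-style sampler, assuming a toy posterior with a Gaussian likelihood and a Laplace (L1) prior; the model, the `prox_l1` helper, and the step-size choices are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def prox_l1(x, t):
    """Proximal map of t*||x||_1 (soft-thresholding), assumed cheap here."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula(y, sigma=1.0, alpha=1.0, lam=0.01, gamma=0.005, n_iter=50_000):
    """Moreau-Yosida-smoothed unadjusted Langevin sampler (illustrative sketch).

    Target: pi(x) proportional to exp(-||y - x||^2 / (2 sigma^2) - alpha ||x||_1).
    The non-smooth L1 term is replaced by its Moreau-Yosida envelope, whose
    gradient is (x - prox_{lam * alpha * ||.||_1}(x)) / lam.
    """
    x = np.zeros_like(y)
    samples = np.empty((n_iter, y.size))
    for k in range(n_iter):
        grad_smooth = (x - y) / sigma**2                    # likelihood gradient
        grad_envelope = (x - prox_l1(x, lam * alpha)) / lam  # envelope gradient
        x = (x - gamma * (grad_smooth + grad_envelope)
             + np.sqrt(2.0 * gamma) * rng.standard_normal(y.size))
        samples[k] = x
    return samples
```

For the second (exact) approach, the sketch below runs a one-dimensional Zig-Zag process, a standard PDMP sampler, on the same kind of non-smooth target, simulating event times by Poisson thinning with a hand-derived local rate bound; the target, the bound, and the tuning constants are again assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.0  # Laplace scale; the potential is non-differentiable at 0

def dU(x):
    # Subgradient of U(x) = x**2 / 2 + alpha * |x|, defined almost everywhere
    return x + alpha * np.sign(x)

def zigzag(T=10_000.0, x=0.0, v=1.0, horizon=1.0):
    """1-D Zig-Zag sampler via Poisson thinning (illustrative sketch)."""
    t, skeleton = 0.0, [(0.0, x, v)]
    while t < T:
        # Constant bound on the event rate max(0, v * dU(x + v*s)) over
        # s in [0, horizon]: since v in {-1, +1},
        #   v * dU(x + v*s) = v*x + s + alpha * v * sign(x + v*s)
        #                  <= max(0, v*x + horizon) + alpha
        bound = max(0.0, v * x + horizon) + alpha
        s = rng.exponential(1.0 / bound)
        if s > horizon:                   # no proposal: advance, refresh bound
            x += v * horizon
            t += horizon
            continue
        x += v * s
        t += s
        rate = max(0.0, v * dU(x))        # true event rate at proposed time
        if rng.random() < rate / bound:   # thinning accept: flip the velocity
            v = -v
            skeleton.append((t, x, v))
    return skeleton
```

Between velocity flips the Zig-Zag trajectory moves linearly, so the returned skeleton determines the full continuous-time path. Unlike the MYULA sketch, this sampler targets the non-smooth posterior without smoothing bias, which mirrors the exactness claim in the abstract.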