A step further towards automatic and efficient reversible jump algorithms

11/05/2019
by Philippe Gagnon, et al.

Incorporating information about the target distribution into proposal mechanisms generally makes Markov chain Monte Carlo algorithms more efficient than those based on naive random walks. Hamiltonian Monte Carlo is a successful example of a fixed-dimensional algorithm that incorporates gradient information. For trans-dimensional algorithms, Green (2003) recommended generating parameter proposals during model switches from normal distributions with informative means and covariance matrices. These proposal distributions can be viewed as approximations to the limiting parameter distributions, where the limit is with respect to the sample size. Models, by contrast, are typically proposed naively. In this paper, we build on the approach of Zanella (2019) for discrete spaces to incorporate information about neighbouring models. More specifically, we rely on approximations to posterior model probabilities that are asymptotically exact as the sample size increases. We prove that, as expected, samplers combining this approach with that of Green (2003) behave in the large-sample regime like samplers able to generate directly from both the model distribution and the parameter distributions. We also prove that the proposed strategy is optimal when the posterior model probabilities concentrate. We review generic methods for improving parameter proposals when the sample size is not large enough, and show how these methods can be leveraged to improve model proposals as well. The methodology is applied to a real-data example. Detailed guidelines for fully automating the implementation of the methodology are provided, and the code is available online.
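As an illustration of the kind of informed model proposal the abstract describes, here is a minimal sketch of a Zanella (2019)-style locally balanced proposal over neighbouring models, assuming approximate log posterior model probabilities are available. The function names and the choice of balancing function g(t) = sqrt(t) are illustrative assumptions, not taken from the paper's code:

```python
import math
import random

def balanced_weights(log_probs, current, neighbours):
    """Locally balanced proposal weights in the style of Zanella (2019):
    each neighbouring model j is weighted by g(pi_j / pi_k) with
    g(t) = sqrt(t), where pi denotes (approximate) posterior model
    probabilities. Names here are illustrative, not from the paper."""
    # g(pi_j / pi_k) = exp(0.5 * (log pi_j - log pi_k)) for g = sqrt
    w = [math.exp(0.5 * (log_probs[j] - log_probs[current]))
         for j in neighbours]
    total = sum(w)
    return [x / total for x in w]

def propose_model(log_probs, current, neighbours, rng=random):
    """Draw a neighbouring model index using the balanced weights above
    (inverse-CDF sampling)."""
    probs = balanced_weights(log_probs, current, neighbours)
    u, acc = rng.random(), 0.0
    for j, p in zip(neighbours, probs):
        acc += p
        if u < acc:
            return j
    return neighbours[-1]
```

For instance, if a neighbouring model has four times the approximate posterior probability of another, the sqrt balancing function proposes it twice as often, rather than choosing among neighbours uniformly as a naive model proposal would.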

