Markov Chain Monte Carlo Methods for Lattice Gaussian Sampling: Convergence Analysis and Enhancement

11/30/2018
by   Zheng Wang, et al.

Sampling from the lattice Gaussian distribution has emerged as an important problem in coding, decoding, and cryptography. In this paper, the classic Gibbs algorithm from Markov chain Monte Carlo (MCMC) methods is shown to be geometrically ergodic for lattice Gaussian sampling, meaning that the Markov chain arising from it converges exponentially fast to the stationary distribution. Meanwhile, the exponential convergence rate of the Markov chain is derived through the spectral radius of the forward operator. A comprehensive analysis of the convergence rate is then carried out, and two sampling schemes are proposed to further enhance the convergence performance. The first, referred to as the Metropolis-within-Gibbs (MWG) algorithm, improves convergence by refining the state space of the univariate sampling. The second, a blocked strategy for the Gibbs algorithm that samples multiple variables jointly at each Markov move, is also shown to yield a better convergence rate than the traditional univariate sampling. In order to perform blocked sampling efficiently, the Gibbs-Klein (GK) algorithm is proposed, which samples block by block using Klein's algorithm. Furthermore, the validity of the GK algorithm is demonstrated by showing its ergodicity. Simulation results based on MIMO detection are presented to confirm the convergence gain brought by the proposed Gibbs sampling schemes.
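To make the setting concrete, the sketch below illustrates a plain coordinate-wise (univariate) Gibbs sampler targeting the discrete Gaussian D_{Lambda,sigma,c}(x) ∝ exp(-||Bx - c||^2 / (2 sigma^2)) over x in Z^n, the baseline scheme whose convergence the paper analyzes. It is a minimal illustration, not the authors' implementation: the function name, the enumeration-based 1-D sampler, and the `tail` truncation parameter are assumptions made for this example, and the MWG and Gibbs-Klein refinements described in the abstract are not shown.

```python
import numpy as np

def gibbs_lattice_gaussian(B, c, sigma, n_iter, x0=None, tail=8, rng=None):
    """Illustrative univariate Gibbs sampler for the lattice Gaussian
    D_{Lambda,sigma,c}(x) ∝ exp(-||Bx - c||^2 / (2 sigma^2)), x in Z^n,
    where Lambda is the lattice generated by the basis B (m x n)."""
    rng = np.random.default_rng() if rng is None else rng
    n = B.shape[1]
    x = np.zeros(n, dtype=int) if x0 is None else np.array(x0, dtype=int)
    samples = []
    for _ in range(n_iter):
        for i in range(n):
            b_i = B[:, i]
            # Residual with coordinate i removed: r = c - B_{-i} x_{-i}
            r = c - B @ x + b_i * x[i]
            # The 1-D conditional of x_i is a discrete Gaussian over Z
            # with center mu_i and standard deviation sigma_i.
            mu_i = (b_i @ r) / (b_i @ b_i)
            sigma_i = sigma / np.linalg.norm(b_i)
            # Sample it by enumerating a truncated integer range
            # (assumed truncation of `tail` standard deviations).
            lo = int(np.floor(mu_i - tail * sigma_i))
            hi = int(np.ceil(mu_i + tail * sigma_i))
            z = np.arange(lo, hi + 1)
            logp = -((z - mu_i) ** 2) / (2.0 * sigma_i ** 2)
            p = np.exp(logp - logp.max())
            p /= p.sum()
            x[i] = rng.choice(z, p=p)
        samples.append(x.copy())
    return np.array(samples)
```

Each sweep updates one integer coordinate at a time from its exact univariate conditional; the paper's blocked Gibbs-Klein scheme would instead draw several coordinates jointly per move, which is where the claimed convergence gain comes from.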
