Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

01/22/2016
by Alican Nalci, et al.

In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of scale mixtures, referred to as the Rectified Gaussian Scale Mixture (R-GSM), to model the sparsity-enforcing prior distribution for the signal of interest. Through a proper choice of the mixing density, the R-GSM prior encompasses a wide variety of heavy-tailed distributions, such as the rectified Laplacian and rectified Student-t distributions. Utilizing the hierarchical representation induced by the scale mixture prior, an evidence maximization or Type II estimation method based on the expectation-maximization (EM) framework is developed to estimate the hyper-parameters and to obtain a point estimate of the parameter of interest. For the proposed method, called Rectified Sparse Bayesian Learning (R-SBL), we provide four alternative approaches that trade off computational complexity against the quality of the E-step computation: Markov chain Monte Carlo EM, linear minimum mean square error (LMMSE) estimation, approximate message passing (AMP), and a diagonal approximation. Through numerical experiments, we show that the proposed R-SBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery.
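As a rough guide to the setup described above, the following is a minimal sketch of the S-NNLS problem and a rectified Gaussian scale mixture prior; the notation (measurements $\mathbf{y}$, dictionary $\mathbf{A}$, signal $\mathbf{x}$, scale variables $\gamma_i$, and the zero-mode form of the rectified Gaussian) is assumed here for illustration and is not taken from the abstract itself, so it may differ from the paper's exact formulation.

$$
\min_{\mathbf{x} \ge 0} \ \|\mathbf{y} - \mathbf{A}\mathbf{x}\|_2^2
\quad \text{with } \mathbf{x} \text{ sparse,}
$$

$$
p(x_i) \;=\; \int_0^{\infty} \mathcal{N}^{R}\!\left(x_i;\, 0, \gamma_i\right)\, p(\gamma_i)\, d\gamma_i,
\qquad
\mathcal{N}^{R}(x;\, 0, \gamma) \;=\; \frac{2}{\sqrt{2\pi\gamma}}\, \exp\!\left(-\frac{x^{2}}{2\gamma}\right) \mathbf{1}_{\{x \ge 0\}}.
$$

Under this kind of hierarchy, choosing different mixing densities $p(\gamma_i)$ yields different heavy-tailed non-negative marginals (e.g., rectified Laplacian or rectified Student-t), and Type II estimation maximizes the evidence over the hyper-parameters $\gamma_i$ within an EM loop.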
