Sampling Requirements and Accelerated Schemes for Sparse Linear Regression with Orthogonal Least-Squares

08/08/2016
by   Abolfazl Hashemi, et al.

The Orthogonal Least Squares (OLS) algorithm sequentially selects columns of the coefficient matrix to greedily find an approximate sparse solution to an underdetermined system of linear equations. Previous work on the analysis of OLS has been limited; in particular, there exist no guarantees on the performance of OLS for sparse linear regression from random measurements. In this paper, the problem of inferring a sparse vector from random linear combinations of its components using OLS is studied. For the noiseless scenario, it is shown that when the entries of the coefficient matrix are sampled from a Gaussian or a Bernoulli distribution, OLS with high probability recovers a k-sparse m-dimensional vector using O(k log m) measurements. A similar result is established for the bounded-noise scenario, where an additional condition on the smallest nonzero element of the unknown vector is required. Moreover, generalizations that reduce the computational complexity of OLS, and thus extend its practical applicability, are proposed. The generalized OLS algorithm is empirically shown to outperform widely used existing algorithms in terms of accuracy, running time, or both.
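The greedy column-selection rule described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' accelerated variant: at each of k iterations, the candidate column whose inclusion yields the smallest least-squares residual is added to the support. The function name `ols` and the brute-force inner loop are choices made for clarity here; the paper's generalizations reduce this per-iteration cost.

```python
import numpy as np

def ols(A, y, k):
    """Greedy Orthogonal Least Squares (illustrative sketch).

    At each step, select the column of A whose inclusion in the
    support minimizes the norm of the residual after re-fitting
    a least-squares solution on the enlarged support.
    """
    m = A.shape[1]
    support = []
    for _ in range(k):
        best_j, best_norm = None, np.inf
        for j in range(m):
            if j in support:
                continue
            cand = support + [j]
            # Least-squares fit restricted to the candidate support.
            x_cand, *_ = np.linalg.lstsq(A[:, cand], y, rcond=None)
            res_norm = np.linalg.norm(y - A[:, cand] @ x_cand)
            if res_norm < best_norm:
                best_norm, best_j = res_norm, j
        support.append(best_j)
    # Final coefficient estimate supported on the selected columns.
    x = np.zeros(m)
    x_sup, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = x_sup
    return x, sorted(support)

# Noiseless example matching the paper's setting: Gaussian
# measurement matrix, k-sparse unknown vector.
rng = np.random.default_rng(0)
n, m, k = 40, 60, 3
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
idx = [5, 17, 42]
x_true[idx] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat, supp = ols(A, y, k)
```

With n = 40 measurements (well above the O(k log m) threshold for k = 3, m = 60), OLS recovers the true support and coefficients in this noiseless instance with high probability.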
