Randomly Aggregated Least Squares for Support Recovery

03/16/2020
by   Ofir Lindenbaum, et al.

We study the problem of exact support recovery: given an (unknown) vector θ∈{-1,0,1}^D, we have access to the noisy measurement y = Xθ + ω, where X ∈ R^{N × D} is a (known) Gaussian matrix and the noise ω∈R^N is an (unknown) Gaussian vector. How small can we choose N while still reliably recovering the support of θ? We present RAWLS (Randomly Aggregated UnWeighted Least Squares Support Recovery): the main idea is to take random subsets of the N equations, solve a least-squares problem over each reduced system, and then average over many random subsets. We show that the proposed procedure provably recovers an approximation of θ and demonstrate its use in support recovery through numerical examples.
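The averaging idea described above can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' implementation: the problem sizes, noise level, subset size, and number of subsets are all assumptions chosen for demonstration. Each iteration solves an under-determined least-squares problem on a random subset of the N equations (NumPy returns the minimum-norm solution), and the estimates are averaged; the support is then read off from the largest entries in magnitude.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the paper)
D, N, k = 30, 60, 5                  # dimension, measurements, support size
theta = np.zeros(D)
support = rng.choice(D, size=k, replace=False)
theta[support] = rng.choice([-1.0, 1.0], size=k)

X = rng.standard_normal((N, D))      # known Gaussian sensing matrix
y = X @ theta + 0.1 * rng.standard_normal(N)  # noisy measurement

def rawls(X, y, subset_size, n_subsets, rng):
    """Sketch of the RAWLS idea: average minimum-norm least-squares
    solutions computed over random subsets of the N equations."""
    N, D = X.shape
    acc = np.zeros(D)
    for _ in range(n_subsets):
        idx = rng.choice(N, size=subset_size, replace=False)
        # Under-determined system: lstsq gives the minimum-norm solution
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        acc += beta
    return acc / n_subsets

est = rawls(X, y, subset_size=20, n_subsets=500, rng=rng)

# Recover the support as the k entries of largest magnitude
recovered = set(np.argsort(-np.abs(est))[:k])
print(sorted(recovered), sorted(support))
```

A useful intuition for why this works: for a Gaussian X, the expected projection onto the row space of a random subset of m equations is (m/D)·I, so the averaged estimate concentrates around a rescaled copy of θ, from which the support is easy to read off.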
