Hedging parameter selection for basis pursuit

05/04/2018
by Stéphane Chrétien, et al.

In Compressed Sensing and high-dimensional estimation, signal recovery often relies on sparsity assumptions, and estimation is performed via ℓ_1-penalized least-squares optimization, a.k.a. the LASSO. The ℓ_1 penalization is usually controlled by a weight, also called the "relaxation parameter", denoted by λ. It is commonly thought that the practical efficiency of the LASSO for prediction crucially relies on an accurate selection of λ. In this short note, we propose to consider the hyper-parameter selection problem from a new perspective which combines the Hedge online learning method of Freund and Schapire with the stochastic Frank-Wolfe method for the LASSO. Using the Hedge algorithm, we show that our simple selection rule can achieve prediction results comparable to cross-validation at a potentially much lower computational cost.
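To make the general idea concrete, below is a minimal sketch, not the authors' implementation, of how a Hedge (multiplicative-weights) update can be run over a small grid of candidate regularization levels while each candidate is fitted with a stochastic Frank-Wolfe iteration on the constrained form of the LASSO, min_β 0.5‖y − Xβ‖² subject to ‖β‖_1 ≤ t. All names and parameter values here (t_grid, eta, n_rounds, batch_size) are illustrative assumptions, not quantities taken from the paper.

```python
# Illustrative sketch: Hedge over candidate l1-ball radii, each fitted by
# stochastic Frank-Wolfe on the constrained LASSO. Not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse regression problem (illustrative only).
n, p, k = 200, 500, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Candidate l1-ball radii play the role of "experts" for Hedge (assumed grid).
t_grid = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
n_experts = len(t_grid)
betas = np.zeros((n_experts, p))     # one Frank-Wolfe iterate per expert
log_w = np.zeros(n_experts)          # Hedge log-weights
eta = 0.5                            # Hedge learning rate (assumed)
n_rounds, batch_size = 300, 20

for k_round in range(1, n_rounds + 1):
    idx = rng.choice(n, batch_size, replace=False)
    Xb, yb = X[idx], y[idx]

    # Hedge update: penalize each expert by its loss on the fresh minibatch.
    losses = np.array([np.mean((yb - Xb @ b) ** 2) for b in betas])
    log_w -= eta * losses
    log_w -= log_w.max()             # numerical stabilization

    # One stochastic Frank-Wolfe step per expert on its own l1 ball.
    gamma = 2.0 / (k_round + 2.0)    # standard FW step size
    for j, t in enumerate(t_grid):
        grad = -Xb.T @ (yb - Xb @ betas[j])       # minibatch gradient
        i_star = np.argmax(np.abs(grad))          # linear minimization oracle
        s = np.zeros(p)
        s[i_star] = -t * np.sign(grad[i_star])    # vertex of the l1 ball
        betas[j] = (1.0 - gamma) * betas[j] + gamma * s

w = np.exp(log_w)
w /= w.sum()
beta_hedge = w @ betas               # Hedge-weighted aggregate predictor
print("Hedge weights per radius:", dict(zip(t_grid.tolist(), np.round(w, 3))))
print("prediction MSE:", np.mean((y - X @ beta_hedge) ** 2))
```

In this sketch the Hedge weights concentrate on the radii whose iterates predict well, which is the sense in which the online aggregation can stand in for a cross-validated choice of the relaxation parameter while reusing the same streamed minibatches that drive the Frank-Wolfe updates.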
