On the Closed-form Proximal Mapping and Efficient Algorithms for Exclusive Lasso Models

02/01/2019
by   Yancheng Yuan, et al.

The exclusive lasso regularization based on the ℓ_1,2 norm has become popular recently due to its superior performance over the group lasso regularization. Compared to the group lasso regularization, which enforces competition among variables across different groups and results in inter-group sparsity, the exclusive lasso regularization enforces competition among variables within each group and results in intra-group sparsity. However, to the best of our knowledge, a correct closed-form solution to the proximal mapping of the ℓ_1,2 norm has remained elusive. In this paper, we fill this gap by deriving a closed-form solution for Prox_{ρ‖·‖_1^2}(·) and its generalized Jacobian. Based on these analytical results, we design efficient first- and second-order algorithms for machine learning models involving the exclusive lasso regularization. Our analytical solution of the proximal mapping for the exclusive lasso regularization can also be used to improve the efficiency of existing algorithms that rely on fast evaluation of this proximal mapping.
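The proximal mapping Prox_{ρ‖·‖_1^2}(a) = argmin_x { (1/2)‖x − a‖^2 + ρ‖x‖_1^2 } can be evaluated by a sort-and-scan procedure. Below is a minimal NumPy sketch for the unweighted, single-group case, assuming the standard optimality condition x_i = sign(a_i) · max(|a_i| − 2ρ‖x*‖_1, 0); the function name prox_sq_l1 is hypothetical and the routine is an illustration, not the paper's implementation.

```python
import numpy as np

def prox_sq_l1(a, rho):
    """Hypothetical sketch of Prox_{rho*||.||_1^2}(a), i.e.
    argmin_x 0.5*||x - a||^2 + rho*||x||_1^2, assuming the solution is a
    soft-thresholding of a with threshold 2*rho*||x*||_1, where ||x*||_1
    is recovered by scanning candidate support sizes."""
    abs_a = np.abs(a)
    s = np.sort(abs_a)[::-1]           # magnitudes in decreasing order
    cumsum = np.cumsum(s)              # S_k = sum of the k largest magnitudes
    k = np.arange(1, a.size + 1)
    # If the k largest entries form the support, then ||x*||_1 = S_k / (1 + 2*rho*k),
    # so the candidate soft-threshold is 2*rho*S_k / (1 + 2*rho*k).
    thresh = 2.0 * rho * cumsum / (1.0 + 2.0 * rho * k)
    valid = s > thresh                 # support condition for each candidate k
    if not np.any(valid):              # only happens when a == 0
        return np.zeros_like(a)
    k_star = np.nonzero(valid)[0][-1]  # largest k satisfying the condition
    return np.sign(a) * np.maximum(abs_a - thresh[k_star], 0.0)
```

Since an exclusive lasso regularizer of the form Σ_g ‖x_g‖_1^2 is separable across groups, its proximal mapping can be evaluated by applying such a routine to each group independently, which is what makes closed-form evaluation attractive inside first- and second-order solvers.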
