A Better Computational Framework for L_2E Regression
Building on the previous research of Chi and Chi (2022), the current paper revisits estimation in robust structured regression under the L_2E criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharper majorization achieves faster convergence than the previous alternating projected gradient descent algorithm (Chi and Chi, 2022). In addition, we reparameterize the model by substituting precision for scale and estimate the precision via a modified Newton's method. This reparameterization simplifies and accelerates overall estimation. Finally, we introduce distance-to-set penalties to allow constrained estimation over nonconvex constraint sets. This tactic also improves performance in coefficient estimation and structure recovery. We demonstrate the merits of the refined framework through a rich set of simulation examples.
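For orientation, the following is a minimal sketch of the Gaussian L_2E objective in the precision parameterization the abstract refers to; the notation (tau for the precision, C for a constraint set, rho for a penalty strength) is ours for illustration and may differ from the paper's.

\[
  f(\beta,\tau) \;=\; \frac{\tau}{2\sqrt{\pi}}
  \;-\; \frac{2}{n}\sum_{i=1}^{n} \frac{\tau}{\sqrt{2\pi}}
        \exp\!\left(-\frac{\tau^{2}}{2}\,\bigl(y_i - x_i^{\top}\beta\bigr)^{2}\right)
\]

Under this reading, the coefficient vector beta is updated by minimizing an MM surrogate of f, the precision tau by a Newton-type step, and constrained estimation is handled by adding a distance-to-set term of the assumed form \((\rho/2)\,\mathrm{dist}(\beta, C)^{2} = (\rho/2)\,\min_{c \in C}\lVert \beta - c\rVert^{2}\), which remains well defined even when C is nonconvex.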