Semismooth Newton Coordinate Descent Algorithm for Elastic-Net Penalized Huber Loss Regression and Quantile Regression

09/09/2015
by Congrui Yi, et al.

We propose an algorithm, semismooth Newton coordinate descent (SNCD), for elastic-net penalized Huber loss regression and quantile regression in high-dimensional settings. Unlike existing coordinate descent-type algorithms, SNCD updates each regression coefficient and its corresponding subgradient simultaneously in each iteration. It combines the strengths of coordinate descent and the semismooth Newton algorithm, and it effectively addresses the computational challenges posed by dimensionality and nonsmoothness. We establish the convergence properties of the algorithm. In addition, we present an adaptive version of the "strong rule" for screening predictors to gain extra efficiency. Through numerical experiments, we demonstrate that the proposed algorithm is very efficient and scalable to ultra-high dimensions. We illustrate the application via a real data example.
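For intuition, here is a minimal Python sketch of the kind of inner update such a solver performs: a coordinate-wise Newton step on the Huber loss followed by soft-thresholding for the elastic-net penalty. It is an illustration only, not the paper's exact SNCD update (which tracks each coefficient jointly with its subgradient); the function name huber_enet_cd, the Huber threshold gamma, and the tolerances are placeholders assumed for the example, and predictors are assumed standardized.

```python
import numpy as np

def huber_enet_cd(X, y, lam, alpha=1.0, gamma=1.0, max_iter=200, tol=1e-6):
    """Elastic-net penalized Huber regression via coordinate descent with
    Newton-type coordinate updates (illustrative sketch, not the paper's SNCD)."""
    n, p = X.shape
    beta = np.zeros(p)
    b0 = 0.0
    r = y - b0 - X @ beta                           # current residuals
    for _ in range(max_iter):
        beta_old = beta.copy()
        # Unpenalized Newton step for the intercept.
        psi = np.clip(r / gamma, -1.0, 1.0)         # Huber score (first derivative)
        w = (np.abs(r) <= gamma) / gamma            # generalized second derivative
        step = psi.mean() / max(w.mean(), 1e-10)
        b0 += step
        r -= step
        for j in range(p):
            psi = np.clip(r / gamma, -1.0, 1.0)
            w = (np.abs(r) <= gamma) / gamma
            g = -(X[:, j] * psi).mean()                 # coordinate gradient
            h = max((w * X[:, j] ** 2).mean(), 1e-10)   # coordinate curvature (guarded)
            z = h * beta[j] - g
            # Soft-thresholding handles the l1 part; the l2 part adds ridge
            # shrinkage through the denominator.
            new_bj = np.sign(z) * max(abs(z) - lam * alpha, 0.0) / (h + lam * (1.0 - alpha))
            if new_bj != beta[j]:
                r -= X[:, j] * (new_bj - beta[j])
                beta[j] = new_bj
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return b0, beta

# Toy usage on simulated data with a handful of true signals.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
beta_true = np.zeros(500)
beta_true[:5] = 2.0
y = X @ beta_true + rng.standard_normal(100)
b0_hat, beta_hat = huber_enet_cd(X, y, lam=0.1, alpha=0.9)
```

The paper applies the same framework to the nonsmooth quantile check loss and, as is typical for penalized solvers, computes solution paths over a grid of penalty levels, with the adaptive "strong rule" screening predictors before each fit to speed up computation.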

Related research

12/11/2022
Retire: Robust Expectile Regression in High Dimensions
High-dimensional data can often display heterogeneity due to heterosceda...

05/05/2022
A Unified Algorithm for Penalized Convolution Smoothed Quantile Regression
Penalized quantile regression (QR) is widely used for studying the relat...

11/26/2013
A Blockwise Descent Algorithm for Group-penalized Multiresponse and Multinomial Regression
In this paper we propose a blockwise descent algorithm for group-penaliz...

03/04/2022
Improved Pathwise Coordinate Descent for Power Penalties
Pathwise coordinate descent algorithms have been used to compute entire ...

12/17/2020
l1-norm quantile regression screening rule via the dual circumscribed sphere
l1-norm quantile regression is a common choice if there exists outlier o...

05/12/2023
Extended ADMM for general penalized quantile regression with linear constraints in big data
Quantile regression (QR) can be used to describe the comprehensive relat...

10/09/2018
SNAP: A semismooth Newton algorithm for pathwise optimization with optimal local convergence rate and oracle properties
We propose a semismooth Newton algorithm for pathwise optimization (SNAP...
