Parameter estimation for high dimensional change point regression models without grid search

05/09/2018
by Abhishek Kaul, et al.

We propose an L1 regularized estimator for the parameters of a high dimensional change point regression model and provide the corresponding rates of convergence for both the regression and the change point estimates. Importantly, the computational cost of our estimator is 2·Lasso(n,p), where Lasso(n,p) represents the computational burden of one Lasso optimization. In comparison, existing grid search based approaches to this problem require a computational cost of at least n·Lasso(n,p), i.e., n Lasso optimizations. We work under a subgaussian random design where the underlying assumptions in our study are milder than those currently assumed in the high dimensional change point regression literature. We allow the true change point parameter τ_0n to possibly move to the boundaries of its parametric space, and the jump size ‖β_0 - γ_0‖_2 to possibly diverge as n increases. We also characterize the corresponding effects of these quantities on the rates of convergence of the regression and change point estimates. Simulations are performed to empirically evaluate the performance of the proposed estimators. The methodology is applied to community level socio-economic data of the U.S., collected from the 1990 U.S. census and other sources.
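To make the computational comparison concrete, the sketch below illustrates the grid-search baseline whose cost the abstract describes as n·Lasso(n,p): one Lasso optimization per candidate change point over an augmented design split at that candidate. The model, penalty level, and variable names are illustrative assumptions; the paper's own two-Lasso estimator is not described in the abstract and is therefore not reproduced here.

```python
# Illustrative sketch (not the authors' algorithm): the grid-search baseline
# behind the n * Lasso(n, p) cost quoted in the abstract. Model specification,
# penalty level, and names are assumptions chosen for illustration only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))          # subgaussian random design
z = rng.uniform(size=n)                  # change point variable
tau0 = 0.4                               # true change point
beta0 = np.zeros(p); beta0[:3] = 2.0     # sparse pre-change coefficients
gamma0 = np.zeros(p); gamma0[:3] = -2.0  # sparse post-change coefficients
y = np.where(z <= tau0, X @ beta0, X @ gamma0) + 0.5 * rng.standard_normal(n)

def lasso_fit(tau, alpha=0.1):
    """One Lasso optimization on the design split at candidate change point tau."""
    W = np.hstack([X * (z <= tau)[:, None], X * (z > tau)[:, None]])
    model = Lasso(alpha=alpha, max_iter=10000).fit(W, y)
    resid = y - model.predict(W)
    return np.mean(resid ** 2), model

# Grid search: one Lasso optimization per candidate tau -> n * Lasso(n, p) cost.
candidates = np.sort(z)[5:-5]            # drop candidates too close to the boundary
losses = [lasso_fit(t)[0] for t in candidates]
tau_hat_grid = candidates[int(np.argmin(losses))]
print("grid-search estimate of tau:", tau_hat_grid)

# The estimator proposed in the paper avoids this loop entirely, requiring
# only two Lasso optimizations in total (2 * Lasso(n, p)); its construction
# is not given in the abstract, so it is not sketched here.
```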
