Maximum Regularized Likelihood Estimators: A General Prediction Theory and Applications

10/09/2017
by Rui Zhuang, et al.

Maximum regularized likelihood estimators (MRLEs) are arguably the most established class of estimators in high-dimensional statistics. In this paper, we derive guarantees for MRLEs in Kullback-Leibler divergence, a general measure of prediction accuracy. We assume only that the densities have a convex parametrization and that the regularization is definite and positive homogeneous. The results therefore apply to a wide variety of models and estimators, including tensor regression and graphical models with convex and non-convex regularizers. A main conclusion is that MRLEs are broadly consistent in prediction, regardless of whether restricted eigenvalues or similar conditions hold.
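
As a concrete illustration of the setting, the sketch below treats the lasso as an MRLE: with Gaussian noise, minimizing the negative log-likelihood plus an l1 penalty (a definite, positive homogeneous regularizer) is the lasso, and for Gaussians with equal variance the Kullback-Leibler divergence reduces to a scaled squared prediction error. The data-generating model, sparsity level, and tuning choice here are illustrative assumptions for the demo, not details taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative high-dimensional setup (assumed for this demo): p >> n,
# sparse true coefficient vector, Gaussian noise with known variance.
rng = np.random.default_rng(0)
n, p, sigma = 100, 500, 1.0
beta = np.zeros(p)
beta[:5] = 2.0                       # assumed sparse signal
X = rng.standard_normal((n, p))
y = X @ beta + sigma * rng.standard_normal(n)

# The lasso as an MRLE: negative Gaussian log-likelihood + lambda * ||.||_1.
# One standard tuning choice for sklearn's (1/2n)-scaled objective; other
# calibrations are possible.
lam = sigma * np.sqrt(2 * np.log(p) / n)
fit = Lasso(alpha=lam, fit_intercept=False).fit(X, y)

# Prediction accuracy in KL divergence: for N(x'beta, sigma^2) versus
# N(x'beta_hat, sigma^2), KL = (x'beta - x'beta_hat)^2 / (2 sigma^2).
# Averaging over the design rows gives an in-sample average KL divergence.
kl = np.mean((X @ beta - X @ fit.coef_) ** 2) / (2 * sigma ** 2)
print(f"average in-sample KL divergence: {kl:.4f}")
```

The KL divergence here is small even though p greatly exceeds n, matching the abstract's point that prediction consistency does not hinge on restricted-eigenvalue-type conditions on the design.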
