Optimal Subsampling Algorithms for Big Data Generalized Linear Models

06/18/2018
by   Mingyao Ai, et al.

To rapidly approximate the maximum likelihood estimator with massive data, Wang et al. (JASA, 2017) proposed an Optimal Subsampling Method under the A-optimality Criterion (OSMAC) for logistic regression. This paper extends the scope of the OSMAC framework to include generalized linear models with canonical link functions. The consistency and asymptotic normality of the estimator from a general subsampling algorithm are established, and optimal subsampling probabilities under the A- and L-optimality criteria are derived. Furthermore, using the Frobenius norm matrix concentration inequality, finite-sample properties of the subsample estimator based on the optimal subsampling probabilities are derived. Since the optimal subsampling probabilities depend on the full-data estimate, an adaptive two-step algorithm is developed. Asymptotic normality and optimality of the estimator from this adaptive algorithm are established. The proposed methods are illustrated and evaluated through numerical experiments on simulated and real datasets.
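The adaptive two-step procedure can be summarized as: (i) fit a pilot estimate on a small uniform subsample, (ii) use it to compute approximately optimal subsampling probabilities, and (iii) re-estimate with an inverse-probability-weighted likelihood on the second-stage subsample. The following is a minimal sketch of that idea for the logistic-regression special case; the function names, subsample sizes, and the L-optimality-style scores |y_i - p_i|·||x_i|| used below are illustrative assumptions, not the paper's exact formulas.

    import numpy as np

    def sigmoid(t):
        return 1.0 / (1.0 + np.exp(-t))

    def weighted_logistic_mle(X, y, w, n_iter=25):
        """Newton iterations maximizing the weighted logistic log-likelihood."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            p = sigmoid(X @ beta)
            grad = X.T @ (w * (y - p))                      # weighted score
            H = (X * (w * p * (1 - p))[:, None]).T @ X      # weighted information
            beta += np.linalg.solve(H, grad)
        return beta

    def two_step_subsample_estimator(X, y, r0=500, r=2000, seed=None):
        """Pilot uniform subsample, then a second informed subsample (sketch)."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]

        # Step 1: pilot estimate from a uniform subsample of size r0.
        idx0 = rng.choice(n, size=r0, replace=True)
        beta0 = weighted_logistic_mle(X[idx0], y[idx0], np.ones(r0))

        # Step 2: subsampling probabilities proportional to |y - p| * ||x||
        # (an assumed L-optimality-style choice for illustration).
        p_full = sigmoid(X @ beta0)
        scores = np.abs(y - p_full) * np.linalg.norm(X, axis=1)
        probs = scores / scores.sum()

        idx = rng.choice(n, size=r, replace=True, p=probs)
        # Inverse-probability weights correct the bias from unequal sampling.
        w = 1.0 / (n * probs[idx])
        return weighted_logistic_mle(X[idx], y[idx], w)

This sketch reflects the general structure only; the paper derives the exact optimal probabilities for canonical-link GLMs and the asymptotic properties of the resulting estimator.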
