Noise-Augmented ℓ_0 Regularization of Tensor Regression with Tucker Decomposition

02/19/2023
by Tian Yan, et al.

Tensor data are multi-dimensional arrays. Low-rank decomposition-based regression methods with tensor predictors exploit the structural information in the predictors while significantly reducing the number of parameters in tensor regression. We propose a method named NA_0CT^2 (Noise Augmentation for ℓ_0 regularization on Core Tensor in Tucker decomposition) to regularize the parameters in tensor regression (TR), coupled with Tucker decomposition. We establish theoretically that NA_0CT^2 achieves exact ℓ_0 regularization on the core tensor from the Tucker decomposition in both linear TR and generalized linear TR. To our knowledge, NA_0CT^2 is the first Tucker decomposition-based regularization method in TR to achieve exact ℓ_0 regularization on the core tensor. NA_0CT^2 is implemented through an iterative procedure that involves two simple steps in each iteration: generating noisy data based on the core tensor from the Tucker decomposition of the updated parameter estimate, and running a regular GLM on the noise-augmented data with vectorized predictors. We demonstrate the implementation of NA_0CT^2 and its ℓ_0 regularization effect in both simulation studies and real data applications. The results suggest that NA_0CT^2 improves prediction compared to other decomposition-based TR approaches, with or without regularization, and that it also helps identify important predictors even though it is not designed for that purpose.
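To make the two-step iteration described above concrete, below is a minimal Python sketch for the linear TR case, using tensorly for the Tucker decomposition and ordinary least squares as the "regular GLM". The pseudo-data construction (the lam/eps variance rule and the Kronecker projection of the vectorized predictors onto the core coefficients) is an illustrative adaptive-ridge-style stand-in chosen to mimic ℓ_0 shrinkage on the core; it is not the exact noise design derived in the paper, and all variable names and tuning constants here are hypothetical.

import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)

# Simulated linear tensor regression: y_i = <X_i, B> + error, with a sparse,
# low-Tucker-rank coefficient matrix B (an order-2 tensor for simplicity).
n, p1, p2 = 300, 8, 6
tucker_rank = (3, 2)
B_true = np.zeros((p1, p2))
B_true[:3, :2] = rng.normal(size=(3, 2))
X = rng.normal(size=(n, p1, p2))
y = np.einsum("nij,ij->n", X, B_true) + 0.1 * rng.normal(size=n)

Xv = X.reshape(n, -1)                                  # vectorized predictors
B_hat = np.linalg.lstsq(Xv, y, rcond=None)[0].reshape(p1, p2)  # unregularized start

lam, eps = 5.0, 1e-4                                   # hypothetical tuning constants
for _ in range(25):
    # Step 1: Tucker-decompose the current coefficient estimate.
    core, factors = tucker(tl.tensor(B_hat), rank=list(tucker_rank))
    U1, U2 = (tl.to_numpy(f) for f in factors)
    g = tl.to_numpy(core).ravel()                      # vec of the core tensor

    # Since vec(B) = (U1 kron U2) vec(G) under row-major vectorization, the GLM on
    # the vectorized predictors reduces to a regression on the core coefficients.
    Z = Xv @ np.kron(U1, U2)

    # Step 2: append noisy pseudo observations whose scale grows as a core entry
    # shrinks, pushing small core entries toward zero (an adaptive-ridge-style
    # surrogate for the l0 penalty; the exact NA_0CT^2 noise distribution differs).
    scale = np.sqrt(lam / (g ** 2 + eps))
    Z_aug = np.vstack([Z, np.diag(scale)])
    y_aug = np.concatenate([y, 0.01 * rng.normal(size=g.size)])

    # A regular (unpenalized) least-squares fit on the noise-augmented data.
    g_new = np.linalg.lstsq(Z_aug, y_aug, rcond=None)[0]
    G = g_new.reshape(tucker_rank)
    B_hat = U1 @ G @ U2.T                              # B = G x1 U1 x2 U2

print("relative estimation error:",
      np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))

For a generalized linear TR, the least-squares call in step 2 would be replaced by an ordinary GLM fit (e.g., logistic regression) on the same noise-augmented design, in line with the abstract's description.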
