Sparse and Low-Rank Tensor Regression via Parallel Proximal Method

11/29/2019
by   Jiaqi Zhang, et al.

Motivated by applications in scientific fields that require predicting the relationship between a higher-order (tensor) feature and a univariate response, we propose a Sparse and Low-rank Tensor Regression model (SLTR). The model enforces sparsity and low-rankness of the tensor coefficient by applying the ℓ_1 norm and the tensor nuclear norm to it directly, so that (1) the structural information of the tensor is preserved and (2) the resulting model is easy to interpret. To make the solving procedure scalable and efficient, SLTR uses the proximal gradient method to handle the two norm regularizers, which can easily be implemented in parallel. Additionally, we prove a tighter convergence rate for third-order tensor data. We evaluate SLTR on several simulated datasets and one fMRI dataset. Experimental results show that, compared with previous models, SLTR obtains a solution of comparable quality at a much lower time cost.
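The abstract does not spell out the proximal operators involved, but the two regularizers it names have well-known closed-form proxes: soft-thresholding for the ℓ_1 norm and singular-value thresholding for the nuclear norm (applied to a matricization of the tensor coefficient). The sketch below is a generic illustration of these two operators, not the paper's exact algorithm; the function names and the choice of mode-1 unfolding are assumptions for illustration.

```python
import numpy as np

def prox_l1(x, lam):
    # Soft-thresholding: the proximal operator of lam * ||x||_1,
    # applied elementwise. Promotes sparsity in the coefficient.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def prox_nuclear(X, lam):
    # Singular-value thresholding: the proximal operator of
    # lam * ||X||_* for a matrix X. Promotes low rank.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ (np.maximum(s - lam, 0.0)[:, None] * Vt)

def unfold(T, mode):
    # Mode-n matricization of a tensor (illustrative helper).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# Toy example: apply each prox to a third-order tensor coefficient.
# In a parallel proximal scheme the two prox evaluations are
# independent and can run concurrently before being combined.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 5, 6))
W_sparse = prox_l1(W, 0.5)                  # elementwise shrinkage
W_lowrank = prox_nuclear(unfold(W, 0), 0.5)  # shrink mode-1 unfolding
```

Because each prox has a cheap closed form, the per-iteration cost is dominated by one gradient evaluation plus one SVD of an unfolding, which is what makes a parallel proximal scheme attractive here.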
