Orthogonal Decomposition of Tensor Trains

10/08/2020
by Karim Halaseh, et al.

In this paper we study the problem of recovering a tensor network decomposition of a given tensor 𝒯 in which the tensors at the vertices of the network are orthogonally decomposable. Specifically, we consider tensor networks in the form of tensor trains (also known as matrix product states). When the tensor train has length two and the orthogonally decomposable tensors at its two vertices are symmetric, we show how to recover the decomposition by considering random linear combinations of slices. Furthermore, if the tensors at the vertices are symmetric but not orthogonally decomposable, we show that a whitening procedure can transform the problem into an orthogonal one, thereby yielding a solution for the decomposition of the tensor. When the tensor train has length three or more and the tensors at the vertices are symmetric and orthogonally decomposable, we provide an algorithm for recovering them subject to certain rank conditions. Finally, for tensor trains of length two in which the tensors at the vertices are orthogonally decomposable but not necessarily symmetric, we show that the decomposition problem reduces to a novel matrix decomposition: that of an orthogonal matrix multiplied by diagonal matrices on either side. We provide two solutions for the full-rank case, one based on Sinkhorn's theorem and one on the Procrustes algorithm, and show that the Procrustes-based solution generalizes to arbitrary rank. We conclude with a number of open problems in linear and multilinear algebra that arose in our study.
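The first result above rests on a classical building block: for a symmetric, orthogonally decomposable (odeco) tensor, a random linear combination of its slices is a symmetric matrix whose eigenvectors are exactly the tensor's orthonormal components. The NumPy sketch below illustrates that building block only; the variable names and shapes are illustrative assumptions, not the paper's notation or its full length-two algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3

# Build a symmetric odeco tensor T = sum_i lam_i * v_i (x) v_i (x) v_i
# with orthonormal components v_1, ..., v_r.
V = np.linalg.qr(rng.standard_normal((n, r)))[0]   # orthonormal columns
lam = rng.uniform(1.0, 2.0, size=r)                # nonzero weights
T = np.einsum('i,ai,bi,ci->abc', lam, V, V, V)

# A random combination of the frontal slices contracts the third mode:
# M = sum_k w_k T[:, :, k] = V diag(lam_i <v_i, w>) V^T,
# so the eigenvectors of M recover the v_i up to sign and order.
w = rng.standard_normal(n)
M = np.einsum('abc,c->ab', T, w)
eigvals, U = np.linalg.eigh(M)                     # M is symmetric

# Keep the r eigenvectors with largest |eigenvalue|; a generic random w
# separates the components, and they match columns of V up to sign.
idx = np.argsort(-np.abs(eigvals))[:r]
V_hat = U[:, idx]
print(np.round(np.abs(V_hat.T @ V), 3))  # close to a permutation matrix
```

Running this prints a matrix close to a permutation matrix, confirming that the recovered components agree with the planted ones up to sign and ordering.

For the nonsymmetric length-two case, the reduction targets a factorization of the form A = D₁QD₂ with Q orthogonal and D₁, D₂ diagonal. One way to see how Sinkhorn's theorem enters: squaring A entrywise gives B with Bᵢⱼ = (d₁)ᵢ² Qᵢⱼ² (d₂)ⱼ², and since Q is orthogonal, Q∘Q is doubly stochastic, so Sinkhorn scaling of B recovers the squared diagonals up to one global scalar. The sketch below illustrates this idea under simplifying assumptions (positive diagonal factors, no zero entries in Q); it is not the paper's algorithm as stated.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Ground truth: A = D1 @ Q @ D2 with Q orthogonal, D1, D2 positive diagonal.
Q = np.linalg.qr(rng.standard_normal((n, n)))[0]
d1 = rng.uniform(0.5, 2.0, n)
d2 = rng.uniform(0.5, 2.0, n)
A = np.diag(d1) @ Q @ np.diag(d2)

# Entrywise square: B_ij = d1_i^2 * Q_ij^2 * d2_j^2, where Q∘Q is doubly
# stochastic. Sinkhorn iteration finds the diagonal scaling that makes
# diag(row) @ B @ diag(col) doubly stochastic.
B = A ** 2
row = np.ones(n)
col = np.ones(n)
for _ in range(500):                 # alternating row/column normalization
    row = 1.0 / (B @ col)
    col = 1.0 / (B.T @ row)

# Uniqueness of the scaling gives row ≈ t/d1^2 and col ≈ 1/(t*d2^2) for
# some scalar t > 0, so the global scalar cancels when reassembling Q.
d1_hat = 1.0 / np.sqrt(row)
d2_hat = 1.0 / np.sqrt(col)
Q_hat = np.diag(1.0 / d1_hat) @ A @ np.diag(1.0 / d2_hat)
print(np.allclose(Q_hat @ Q_hat.T, np.eye(n), atol=1e-6))  # ≈ orthogonal
print(np.allclose(Q_hat, Q, atol=1e-6))                    # recovers Q
```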
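In the sketch above, the Procrustes-based alternative mentioned in the abstract would instead fit the nearest orthogonal matrix directly, which is why it extends beyond the full-rank setting where the Sinkhorn scaling argument applies.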
