The trouble with tensor ring decompositions

11/09/2018
by Kim Batselier, et al.

The tensor train decomposition decomposes a tensor into a "train" of 3-way tensors that are interconnected through the summation of auxiliary indices. The decomposition is stable, has a well-defined notion of rank, and enables the user to perform various linear algebra operations on vectors and matrices of exponential size in a computationally efficient manner. The tensor ring decomposition replaces the train by a ring through the introduction of one additional auxiliary index. This article discusses a major issue with the tensor ring decomposition: its inability to compute an exact minimal-rank decomposition from a decomposition with sub-optimal ranks. Both the contraction operation and the Hadamard product are motivated from applications, and it is shown through simple examples how the tensor ring rounding procedure fails to retrieve minimal-rank decompositions under these operations. These observations, together with the already known issue of not being able to find a best low-rank tensor ring approximation to a given tensor, indicate that the applicability of tensor rings is severely limited.
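To make the role of the auxiliary indices concrete, here is a minimal NumPy sketch, not taken from the paper, that contracts a list of tensor ring cores back into the full tensor. The function name, core shapes, and ranks are illustrative assumptions; the tensor train is recovered as the special case where the first and last ranks equal 1, so that the closing trace is trivial.

```python
import numpy as np

def tensor_ring_to_full(cores):
    """Contract 3-way cores G_k of shape (r_k, n_k, r_{k+1}) into the
    full tensor. Consecutive cores share an auxiliary index that is
    summed over; the ring is closed by tracing over the auxiliary
    index shared by the first and last cores."""
    result = cores[0]                      # axes: (r_1, n_1, r_2)
    for core in cores[1:]:
        # sum over the auxiliary index linking consecutive cores
        result = np.einsum('a...b,bnc->a...nc', result, core)
    # close the ring: trace over the remaining pair of auxiliary indices
    return np.einsum('a...a->...', result)

# Hypothetical example: ring ranks (2, 3, 2) for a 4 x 5 x 6 tensor
rs, ns = [2, 3, 2], [4, 5, 6]
cores = [np.random.randn(rs[k], ns[k], rs[(k + 1) % 3]) for k in range(3)]
T = tensor_ring_to_full(cores)
print(T.shape)  # (4, 5, 6)
```

Setting `rs = [1, 3, 1]` in this sketch yields a tensor train, where the final trace reduces to dropping two singleton axes.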
