Neural tensor contractions and the expressive power of deep neural quantum states

03/18/2021
by   Or Sharir, et al.

We establish a direct connection between general tensor networks and deep feed-forward artificial neural networks. At the core of our results is the construction of neural-network layers that efficiently perform tensor contractions and that use commonly adopted non-linear activation functions. The resulting deep networks feature a number of edges that closely matches the contraction complexity of the tensor networks to be approximated. In the context of many-body quantum states, this result establishes that neural-network states have strictly the same or higher expressive power than practically usable variational tensor networks. As an example, we show that all matrix product states can be efficiently written as neural-network states with a number of edges polynomial in the bond dimension and a depth logarithmic in the system size. The converse, however, does not hold: our results imply that there exist quantum states that are not efficiently expressible in terms of matrix product states or practically usable PEPS, but that are efficiently expressible with neural-network states.
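To make the notion of "contraction complexity" concrete, the following is a minimal NumPy sketch (not the paper's neural-network construction) of the kind of tensor contraction being approximated: evaluating the amplitude of a matrix product state for one spin configuration by absorbing one site tensor at a time, at cost polynomial in the bond dimension. The function name and tensor shapes are illustrative assumptions.

```python
import numpy as np

def mps_amplitude(tensors, config):
    """Contract an open-boundary MPS for one spin configuration.

    tensors: list of arrays of shape (D_left, d, D_right);
             the boundary tensors have D_left = 1 and D_right = 1.
    config:  sequence of local indices s_i in range(d).
    """
    # Absorb one site at a time from the left, so the cost is
    # O(N * D^2) per configuration rather than exponential in N.
    v = tensors[0][:, config[0], :]          # shape (1, D)
    for A, s in zip(tensors[1:], config[1:]):
        v = v @ A[:, s, :]                   # (1, D) @ (D, D') -> (1, D')
    return v[0, 0]

# Random MPS with bond dimension 3 on 5 spin-1/2 sites.
rng = np.random.default_rng(0)
D, d, N = 3, 2, 5
tensors = ([rng.normal(size=(1, d, D))]
           + [rng.normal(size=(D, d, D)) for _ in range(N - 2)]
           + [rng.normal(size=(D, d, 1))])
amp = mps_amplitude(tensors, [0, 1, 0, 1, 1])
print(amp)
```

The paper's result can be read as saying that this sequence of matrix-vector products can be emulated by a feed-forward network with standard activations whose edge count tracks the same polynomial cost in D.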


Related research

11/24/2018 · Implementing Entangled States on a Quantum Computer
The study of tensor network theory is an important field and promises a ...

01/10/2022 · Quantum activation functions for quantum neural networks
The field of artificial neural networks is expected to strongly benefit ...

04/06/2018 · Quantum Machine Learning Matrix Product States
Matrix product states minimize bipartite correlations to compress the cl...

04/21/2021 · Scaling of neural-network quantum states for time evolution
Simulating quantum many-body dynamics on classical computers is a challe...

06/15/2021 · Quantum-inspired event reconstruction with Tensor Networks: Matrix Product States
Tensor Networks are non-trivial representations of high-dimensional tens...

12/22/2020 · Residual Matrix Product State for Machine Learning
Tensor network (TN), which originates from quantum physics, shows broad ...

01/05/2020 · On Stability of Tensor Networks and Canonical Forms
Tensor networks such as matrix product states (MPS) and projected entang...
