Learnability and Complexity of Quantum Samples

10/22/2020
by Murphy Yuezhen Niu, et al.

Given a quantum circuit, a quantum computer can sample from its output distribution exponentially faster in the number of bits than a classical computer. A similar exponential separation has yet to be established for generative models through quantum sample learning: given samples from an n-qubit computation, can we learn the underlying quantum distribution using models whose number of training parameters scales polynomially in n under a fixed training time? We study four kinds of generative models: the Deep Boltzmann Machine (DBM), Generative Adversarial Networks (GANs), Long Short-Term Memory (LSTM) networks, and the Autoregressive GAN, on learning quantum data sets generated by deep random circuits. We demonstrate the leading performance of the LSTM in learning quantum samples, and thus the autoregressive structure present in the underlying quantum distribution produced by random quantum circuits. Both numerical experiments and a theoretical proof in the case of the DBM show an exponentially growing number of learning-agent parameters required to achieve a fixed accuracy as n increases. Finally, we establish a connection between learnability and the complexity of generative models by benchmarking learnability against different sets of samples drawn from probability distributions of varying degrees of complexity in their quantum and classical representations.
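To make the quantum sample learning setup concrete, the sketch below shows an autoregressive LSTM that models n-bit measurement outcomes bit by bit, factorizing p(x) = p(x_1) p(x_2 | x_1) ... p(x_n | x_<n). This is a minimal illustration assuming PyTorch; the model name BitstringLSTM, the hyperparameters, and the uniform placeholder data are assumptions for illustration and are not taken from the paper, where the training bitstrings are sampled from deep random quantum circuits.

```python
# Illustrative sketch (not the authors' code): an autoregressive LSTM over
# n-qubit measurement outcomes, trained by next-bit cross-entropy.
import torch
import torch.nn as nn

n_qubits = 12   # assumed toy size
hidden = 64     # assumed hidden width

class BitstringLSTM(nn.Module):
    def __init__(self, n_bits, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(3, hidden_size)   # tokens: bit 0, bit 1, and a start symbol (2)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)       # logits for the next bit
        self.n_bits = n_bits

    def forward(self, bits):                        # bits: (batch, n_bits) with entries in {0, 1}
        start = torch.full((bits.size(0), 1), 2, dtype=torch.long)
        inp = torch.cat([start, bits[:, :-1]], dim=1)   # shift right, prepend start token
        h, _ = self.lstm(self.embed(inp))
        return self.head(h)                         # (batch, n_bits, 2)

model = BitstringLSTM(n_qubits, hidden)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder "quantum samples": uniform random bitstrings; in the paper these
# would be measured outputs of a deep random quantum circuit.
data = torch.randint(0, 2, (4096, n_qubits))

for step in range(200):
    batch = data[torch.randint(0, data.size(0), (256,))]
    logits = model(batch)
    loss = loss_fn(logits.reshape(-1, 2), batch.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Sampling from such a model proceeds bit by bit, feeding each generated bit back in as the next input, which is what allows an autoregressive learner to capture sequential structure in the sampled distribution.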


