Expressibility-Enhancing Strategies for Quantum Neural Networks

11/23/2022
by Yalin Liao et al.

Quantum neural networks (QNNs), represented by parameterized quantum circuits, can be trained in the paradigm of supervised learning to map input data to predictions. Much work has focused on theoretically analyzing the expressive power of QNNs. However, in almost all of the literature, QNNs' expressive power is numerically validated using only simple univariate functions. Surprisingly, we find that state-of-the-art QNNs with strong theoretical expressive power can perform poorly when approximating even a simple sinusoidal function. To fill this gap, we propose four expressibility-enhancing strategies for QNNs: sinusoidal-friendly embedding, redundant measurement, post-measurement function, and random training data. We analyze the effectiveness of these strategies through mathematical analysis and/or numerical studies, including learning complex sinusoidal-based functions. Our comparative experiments validate that the four strategies can significantly improve QNNs' performance in approximating complex multivariable functions while reducing the circuit depth and number of qubits required.
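As a concrete illustration of the setting (a minimal sketch, not the authors' code), the snippet below fits the univariate target sin(x) with a small data re-uploading circuit in PennyLane. The qubit and layer counts, entangling pattern, optimizer, and the `circuit`/`cost` helpers are illustrative assumptions that loosely mirror the "sinusoidal-friendly embedding" (angle encoding of the input) and "random training data" (randomly sampled inputs) strategies named in the abstract, not the paper's architecture.

```python
# Illustrative sketch only: a small data re-uploading QNN fitting sin(x).
# Hyperparameters and layer structure are assumptions, not the paper's model.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 2, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    # Data re-uploading: the scalar input x is encoded as a rotation angle
    # before every trainable layer, so the output is a truncated
    # trigonometric series in x (a "sinusoidal-friendly" embedding choice).
    for layer in range(n_layers):
        for w in range(n_qubits):
            qml.RX(x, wires=w)
            qml.Rot(*weights[layer, w], wires=w)
        qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))

def cost(weights, xs, ys):
    # Mean-squared error between circuit outputs and the sinusoidal target.
    loss = 0.0
    for x, y in zip(xs, ys):
        loss = loss + (circuit(weights, x) - y) ** 2
    return loss / len(xs)

# Randomly sampled training inputs (cf. the "random training data" strategy).
np.random.seed(0)
xs = np.random.uniform(0, 2 * np.pi, 30)
ys = np.sin(xs)

weights = np.array(np.random.normal(0, 0.1, (n_layers, n_qubits, 3)),
                   requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.1)
for step in range(100):
    weights = opt.step(lambda w: cost(w, xs, ys), weights)
```

Even this toy example exposes the design choices the paper studies: how the input is embedded, what is measured, and how training points are sampled all affect whether the circuit can represent a sinusoidal target.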


Related research

10/29/2018 · The Expressive Power of Parameterized Quantum Circuits
Parameterized quantum circuits (PQCs) have been broadly used as a hybrid...

07/12/2021 · Fock State-enhanced Expressivity of Quantum Machine Learning Models
The data-embedding process is one of the bottlenecks of quantum machine ...

08/19/2020 · The effect of data encoding on the expressive power of variational quantum machine learning models
Quantum computers can be used for supervised learning by treating parame...

06/16/2022 · Concentration of Data Encoding in Parameterized Quantum Circuits
Variational quantum algorithms have been acknowledged as a leading strat...

05/16/2022 · Power and limitations of single-qubit native quantum neural networks
Quantum neural networks (QNNs) have emerged as a leading strategy to est...

02/29/2020 · Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs
Batch normalization (BatchNorm) has become an indispensable tool for tra...

12/13/2017 · Regularization and Optimization strategies in Deep Convolutional Neural Network
Convolution Neural Networks, known as ConvNets exceptionally perform wel...
