Completely Quantum Neural Networks

02/23/2022
by Steve Abel, et al.

Artificial neural networks are at the heart of modern deep learning algorithms. We describe how to embed and train a general neural network in a quantum annealer without introducing any classical element in training. To implement the network on a state-of-the-art quantum annealer, we develop three crucial ingredients: binary encoding of the network's free parameters, polynomial approximation of the activation function, and reduction of binary higher-order polynomials to quadratic ones. Together, these ideas allow the loss function to be encoded as an Ising model Hamiltonian. The quantum annealer then trains the network by finding the ground state. We implement this for an elementary network and illustrate the advantages of quantum training: it consistently finds the global minimum of the loss function, and it converges in a single annealing step, which leads to short training times while maintaining high classification performance. Our approach opens a novel avenue for the quantum training of general machine learning models.
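
As a rough illustration of two of these ingredients, the Python sketch below shows a fixed-point binary encoding of a single continuous weight and a Rosenberg-style reduction of a cubic binary term to a quadratic one, checked by brute force. The variable names, the weight range, and the penalty constant are illustrative assumptions, not the paper's implementation.

import itertools

def decode_weight(bits, scale=2.0, offset=-1.0):
    """Map a tuple of binary variables to a real weight in [offset, offset + scale).
    (Assumed 2-bit fixed-point encoding, purely for illustration.)"""
    return offset + scale * sum(b / 2**(i + 1) for i, b in enumerate(bits))

def reduce_cubic(c, penalty=10.0):
    """Replace the cubic term c*x0*x1*x2 by a quadratic polynomial in x0, x1, x2
    and an auxiliary binary y. The Rosenberg penalty
    penalty*(x0*x1 - 2*y*(x0 + x1) + 3*y) enforces y = x0*x1 at the minimum."""
    return {
        ("x2", "y"): c,                 # c * y * x2 stands in for c * x0*x1*x2
        ("x0", "x1"): penalty,          # penalty terms enforcing y = x0*x1
        ("x0", "y"): -2.0 * penalty,
        ("x1", "y"): -2.0 * penalty,
        ("y", "y"): 3.0 * penalty,
    }

if __name__ == "__main__":
    # Verify the reduction over all binary assignments of x0, x1, x2.
    c = 1.7
    quad = reduce_cubic(c)
    for x0, x1, x2 in itertools.product((0, 1), repeat=3):
        cubic_val = c * x0 * x1 * x2
        # minimise over the auxiliary variable y, as the annealer would
        best = min(
            sum(coef * {"x0": x0, "x1": x1, "x2": x2, "y": y}[a]
                     * {"x0": x0, "x1": x1, "x2": x2, "y": y}[b]
                for (a, b), coef in quad.items())
            for y in (0, 1)
        )
        assert abs(best - cubic_val) < 1e-9
    print("2-bit weight example:", decode_weight((1, 0)))  # -> 0.0

For this single term, any penalty larger than the magnitude of the cubic coefficient is enough for the auxiliary variable to lock onto the product it replaces; the brute-force check above confirms this for the chosen values.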

Related research

12/11/2014: Simulating a perceptron on a quantum computer
Perceptrons are the basic computational unit of artificial neural networ...

11/14/2022: An Invitation to Distributed Quantum Neural Networks
Deep neural networks have established themselves as one of the most prom...

07/02/2018: Classifying Data with Local Hamiltonians
The goal of this work is to define a notion of a quantum neural network ...

01/19/2023: Quantum HyperNetworks: Training Binary Neural Networks in Quantum Superposition
Binary neural networks, i.e., neural networks whose parameters and activ...

08/24/2023: Training Neural Networks with Universal Adiabatic Quantum Computing
The training of neural networks (NNs) is a computationally intensive tas...

06/25/2020: Recurrent Quantum Neural Networks
Recurrent neural networks are the foundation of many sequence-to-sequenc...

01/09/2020: On the loss of learning capability inside an arrangement of neural networks
We analyze the loss of information and the loss of learning capability i...
