Continual learning of quantum state classification with gradient episodic memory

by Haozhen Situ et al.

Continual learning is an active area of machine learning research. To approach strong artificial intelligence that mimics human-level intelligence, AI systems should be able to adapt to ever-changing scenarios and learn new knowledge continuously without forgetting previously acquired knowledge. When a machine learning model is trained across multiple tasks in sequence, a phenomenon called catastrophic forgetting emerges: the model's performance on previously learned tasks may drop dramatically while it learns a new task. Several continual learning strategies have been proposed to address this problem, and continual learning has recently also been studied in the context of quantum machine learning. By leveraging the elastic weight consolidation method, a single quantum classifier can perform multiple tasks after being trained consecutively on those tasks. In this work, we incorporate the gradient episodic memory (GEM) method into the training of a variational quantum classifier. The gradient of the current task is projected onto the closest gradient that avoids increasing the loss on previous tasks while still allowing it to decrease. We benchmark this method on six quantum state classification tasks. Numerical simulation results show better performance than the elastic weight consolidation method. Furthermore, positive transfer of knowledge to previous tasks is observed, meaning the classifier's performance on previous tasks is enhanced rather than compromised while learning a new task.
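As an illustration only (not the authors' code), the core GEM idea described above can be sketched for the single-constraint case: if the current-task gradient conflicts with a stored previous-task gradient (negative dot product, i.e. the update would raise the previous task's loss), remove the conflicting component; otherwise use the gradient unchanged. The function name and the NumPy setting are assumptions for this sketch; the full GEM method solves a small quadratic program over all previous tasks' gradients.

```python
import numpy as np

def gem_project(g, g_prev):
    """Single-constraint GEM-style projection (illustrative sketch).

    If g conflicts with g_prev (g . g_prev < 0), project g onto the
    half-space of directions that do not increase the previous task's
    loss; otherwise return g unchanged.
    """
    dot = g @ g_prev
    if dot < 0:
        # Subtract the component of g that points against g_prev.
        g = g - (dot / (g_prev @ g_prev)) * g_prev
    return g

# A conflicting gradient gets its harmful component removed:
g = np.array([1.0, -1.0])       # current-task gradient
g_prev = np.array([0.0, 1.0])   # gradient on a memorized previous task
g_tilde = gem_project(g, g_prev)
# g_tilde no longer decreases the previous task's objective direction
```

The projected gradient satisfies the GEM constraint `g_tilde @ g_prev >= 0` by construction, so a gradient step along it cannot (to first order) increase the previous task's loss.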


