Continual learning of quantum state classification with gradient episodic memory

03/26/2022
by Haozhen Situ, et al.

Continual learning is an active area of machine learning research. To approach the goal of strong artificial intelligence that mimics human-level intelligence, AI systems need the ability to adapt to ever-changing scenarios and learn new knowledge continuously without forgetting previously acquired knowledge. A phenomenon called catastrophic forgetting emerges when a machine learning model is trained sequentially on multiple tasks: the model's performance on previously learned tasks may drop dramatically while it learns a new task. Several continual learning strategies have been proposed to address this problem. Recently, continual learning has also been studied in the context of quantum machine learning: by leveraging the elastic weight consolidation method, a single quantum classifier can perform multiple tasks after being trained consecutively on them. In this work, we incorporate the gradient episodic memory method to train a variational quantum classifier. The gradient computed on the current task is projected onto the closest gradient that avoids increasing the loss on any previous task, while still allowing that loss to decrease. We benchmark this method on six quantum state classification tasks. Numerical simulation results show that it achieves better performance than the elastic weight consolidation method. Furthermore, we observe positive transfer of knowledge to previous tasks: the classifier's performance on previous tasks is enhanced rather than compromised while it learns a new task.
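To make the projection step concrete, here is a minimal NumPy/SciPy sketch of the gradient episodic memory (GEM) projection of Lopez-Paz and Ranzato (2017), which the abstract describes; the function name gem_project and the use of SciPy's generic bounded optimizer (rather than a dedicated QP solver) are illustrative choices, not details taken from this paper. GEM stores one reference gradient per previous task and, whenever the current gradient g conflicts with any of them (a negative inner product, meaning a previous task's loss would increase), replaces g with the closest gradient whose inner product with every stored gradient is non-negative, found by solving a small quadratic program in its dual form.

```python
import numpy as np
from scipy.optimize import minimize

def gem_project(g, G):
    """Project the current-task gradient `g` onto the closest vector whose
    inner product with each previous-task gradient (the rows of `G`) is
    non-negative, following GEM (Lopez-Paz & Ranzato, 2017).

    Solves the dual QP   min_v  1/2 v^T G G^T v + g^T G^T v,  v >= 0,
    then recovers the projected gradient  g~ = G^T v* + g.
    """
    if np.all(G @ g >= 0):
        return g  # no conflict with any previous task: keep the raw gradient
    GGt, Gg = G @ G.T, G @ g
    res = minimize(
        lambda v: 0.5 * v @ GGt @ v + Gg @ v,  # dual objective
        np.zeros(G.shape[0]),
        jac=lambda v: GGt @ v + Gg,
        bounds=[(0.0, None)] * G.shape[0],     # dual feasibility v >= 0
    )
    return G.T @ res.x + g

# Toy check with random vectors standing in for parameter gradients.
rng = np.random.default_rng(0)
g = rng.normal(size=8)        # gradient of the current task's loss
G = rng.normal(size=(3, 8))   # one stored gradient per previous task
g_tilde = gem_project(g, G)
print("before:", G @ g)        # may contain negative entries (conflicts)
print("after: ", G @ g_tilde)  # all (numerically) non-negative
```

Because only the inequality constraints are imposed, the projected gradient may still have a strictly positive inner product with a stored gradient, which is how a step on the new task can simultaneously reduce a previous task's loss; this is the mechanism behind the positive transfer reported above.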
