Paper ID: 2305.09738
CQural: A Novel CNN based Hybrid Architecture for Quantum Continual Machine Learning
Sanyam Jain
Training machine learning models incrementally is not only important but also an efficient route toward artificial general intelligence. The human capacity for continuous, or lifelong, learning allows previously learned tasks to be retained. Current neural network models, however, are prone to catastrophic forgetting in continual learning settings. Researchers have proposed several techniques to reduce forgetting in neural networks, but these techniques have been studied almost exclusively in the classical setting, with very little attention to changing the model architecture itself. In this paper, we show that a novel hybrid classical-quantum neural network can not only circumvent catastrophic forgetting in continual learning but also explain which features are most important for classification. In addition, we claim that a model trained with these explanations tends to perform better and to learn specific features that lie far from the decision boundary. Finally, we present experimental results comparing classical and classical-quantum hybrid architectures on the benchmark MNIST and CIFAR-10 datasets. After successful runs of the learning procedure, we find that the hybrid neural network outperforms the classical one in terms of retaining the correct evidence for class-specific features.
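To illustrate what a hybrid classical-quantum network of the kind described above can look like, here is a minimal sketch that pairs a classical CNN feature extractor with a variational quantum layer via PennyLane's TorchLayer on top of PyTorch. The qubit count, layer sizes, and circuit choice (angle embedding plus basic entangler layers) are assumptions for demonstration only and are not the specific CQural architecture proposed in the paper.

```python
import pennylane as qml
import torch
import torch.nn as nn

n_qubits = 4  # assumed qubit count for illustration
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quantum_circuit(inputs, weights):
    # Encode classical features as rotation angles, then apply a
    # trainable entangling ansatz; measure Pauli-Z expectations.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridCNN(nn.Module):
    """Classical CNN front-end followed by a variational quantum layer."""

    def __init__(self, n_classes=10):
        super().__init__()
        # Classical convolutional feature extractor (hypothetical sizes).
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.fc_in = nn.Linear(16 * 7 * 7, n_qubits)  # compress to qubit count
        weight_shapes = {"weights": (2, n_qubits)}     # 2 entangling layers
        self.qlayer = qml.qnn.TorchLayer(quantum_circuit, weight_shapes)
        self.fc_out = nn.Linear(n_qubits, n_classes)   # classical read-out

    def forward(self, x):
        x = self.features(x).flatten(1)
        x = torch.tanh(self.fc_in(x))  # bound inputs for angle embedding
        x = self.qlayer(x)
        return self.fc_out(x)

# Usage on a batch of MNIST-sized images (1 x 28 x 28):
model = HybridCNN()
logits = model(torch.randn(2, 1, 28, 28))
print(logits.shape)  # torch.Size([2, 10])
```

Such a model trains end to end with a standard optimizer and cross-entropy loss; the quantum layer's circuit parameters receive gradients alongside the classical weights through PennyLane's PyTorch interface.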
Submitted: May 16, 2023