Class Incremental Continual Learning
Class-incremental continual learning (CICL) focuses on training machine learning models that learn new classes sequentially from a data stream without forgetting previously learned information, a significant challenge for traditional deep learning methods. Current research emphasizes techniques such as memory replay (including compressed and proxy-based variants), parameter-efficient fine-tuning of pre-trained models, and contrastive learning and knowledge distillation to improve knowledge retention and mitigate catastrophic forgetting. The field is crucial for developing robust, adaptable AI systems capable of handling real-world scenarios with evolving data distributions, with applications ranging from medical image analysis and autonomous driving to personalized dialog systems.
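To make the replay-plus-distillation recipe mentioned above concrete, here is a minimal PyTorch sketch of one class-incremental training step. It is not taken from any of the papers listed below; the toy linear classifier, synthetic batches, and class split (classes 0-4 old, 5-9 new) are all illustrative assumptions.

```python
# Minimal sketch: memory replay + knowledge distillation for CICL.
# All model/data names here are hypothetical, for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target KL divergence between the current model (student)
    and a frozen snapshot of the previous-task model (teacher)."""
    log_p = F.log_softmax(student_logits / T, dim=1)
    q = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(log_p, q, reduction="batchmean") * (T * T)

# Toy classifier; in practice this would be a CNN or pre-trained backbone.
model = nn.Linear(32, 10)          # current model, learns old + new classes
old_model = nn.Linear(32, 10)      # frozen copy taken before the new task
old_model.load_state_dict(model.state_dict())
for p in old_model.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Synthetic batch: new-class data plus exemplars replayed from a memory buffer.
new_x, new_y = torch.randn(16, 32), torch.randint(5, 10, (16,))  # new classes 5-9
mem_x, mem_y = torch.randn(8, 32), torch.randint(0, 5, (8,))     # old classes 0-4
x = torch.cat([new_x, mem_x])
y = torch.cat([new_y, mem_y])

logits = model(x)
with torch.no_grad():
    old_logits = old_model(x)

# Cross-entropy on all labels, plus distillation on the old-class outputs,
# anchors old-class behavior while the model fits the new classes.
loss = F.cross_entropy(logits, y) + distillation_loss(
    logits[:, :5], old_logits[:, :5]
)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Restricting the distillation term to the old-class logits, in the style of LwF- and iCaRL-type objectives, lets the new-class outputs move freely while penalizing drift on previously learned classes.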
Papers
Domain-Agnostic Neural Architecture for Class Incremental Continual Learning in Document Processing Platform
Mateusz Wójcik, Witold Kościukiewicz, Mateusz Baran, Tomasz Kajdanowicz, Adam Gonczarek
CILF: Causality Inspired Learning Framework for Out-of-Distribution Vehicle Trajectory Prediction
Shengyi Li, Qifan Xue, Yezhuo Zhang, Xuanpeng Li