Exemplar-Free Class Incremental Learning
Exemplar-free class incremental learning (EFCIL) tackles the challenge of training machine learning models on a sequence of tasks without retaining any data from previous tasks, addressing both privacy concerns and storage limitations. Current research focuses on mitigating "catastrophic forgetting"—the loss of previously learned knowledge when a model is trained on new data—through techniques such as analytical (closed-form) solutions like recursive least squares, feature consolidation, and attention mechanisms, applied across architectures including convolutional neural networks and vision transformers. Progress in EFCIL is crucial for building robust, privacy-preserving AI systems capable of continual learning in real-world settings with limited resources.
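To make the analytical-solution family concrete, the sketch below shows how a linear classifier head on top of frozen features can be updated exemplar-free with recursive least squares: each sample updates the weights and an inverse-Gram matrix in place, so no past data needs to be stored. This is a minimal illustration of the general idea, not the exact algorithm of any particular paper; the class name `RLSClassifier` and its parameters are illustrative.

```python
import numpy as np

class RLSClassifier:
    """Linear head trained by recursive least squares (RLS).

    Equivalent to ridge regression on all data seen so far, but each
    sample is processed once and then discarded (exemplar-free).
    """

    def __init__(self, feat_dim, reg=1.0):
        self.W = np.zeros((feat_dim, 0))          # weights, grown as classes arrive
        self.P = np.eye(feat_dim) / reg           # inverse of regularized Gram matrix

    def fit_task(self, X, Y):
        """Incrementally fit one task.

        X: (n, feat_dim) frozen backbone features.
        Y: (n, total_classes) one-hot targets, padded for all classes so far.
        """
        # Expand W with zero columns for classes introduced by this task.
        n_new = Y.shape[1] - self.W.shape[1]
        if n_new > 0:
            self.W = np.hstack([self.W, np.zeros((self.W.shape[0], n_new))])
        for x, y in zip(X, Y):
            x = x.reshape(-1, 1)
            Px = self.P @ x
            k = Px / (1.0 + x.T @ Px)             # RLS gain vector
            self.W += k @ (y.reshape(1, -1) - x.T @ self.W)
            self.P -= k @ Px.T                    # Sherman-Morrison rank-1 update

    def predict(self, X):
        return (X @ self.W).argmax(axis=1)
```

Because the update is exact (via the Sherman-Morrison identity), the weights after any number of tasks match the ridge-regression solution computed on all tasks jointly, which is why such analytical heads do not forget earlier classes.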