Catastrophic Interference
Catastrophic interference describes the phenomenon where learning new information causes a neural network to forget previously learned information, hindering continual learning. Current research focuses on mitigating this effect through techniques like sparse adaptation (identifying and optimizing only a subset of model weights), interference-free low-rank adaptation, and weighted training methods that account for data distribution shifts during model retraining. Addressing catastrophic interference is crucial for improving the robustness and efficiency of machine learning models across diverse applications, ranging from recommendation systems and A/B testing to continual learning in robotics and natural language processing.
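As a rough illustration of one of these ideas, the sketch below shows a generic sparse-adaptation loop in PyTorch: gradients on the new task are used to pick a small subset of weights, and only that subset is updated during retraining, leaving the rest frozen to protect previously learned behavior. This is a minimal sketch under assumed names (build_sparse_masks, sparse_update_step, keep_ratio, the toy model and data), not the method of any particular paper listed here.

```python
# Minimal sketch of sparse adaptation to reduce catastrophic interference.
# Assumption: weights with the largest gradient magnitude on the new task are
# the ones worth updating; all other weights are kept frozen.
import torch
import torch.nn as nn

def build_sparse_masks(model, loss_fn, data_loader, keep_ratio=0.01):
    """Score parameters by gradient magnitude on the new task; keep the top fraction."""
    model.zero_grad()
    for inputs, targets in data_loader:
        loss_fn(model(inputs), targets).backward()  # accumulate gradients over batches
    masks = {}
    for name, p in model.named_parameters():
        if p.grad is None:
            continue
        scores = p.grad.abs().flatten()
        k = max(1, int(keep_ratio * scores.numel()))
        threshold = torch.topk(scores, k).values.min()
        masks[name] = (p.grad.abs() >= threshold).float()  # 1 = trainable, 0 = frozen
    model.zero_grad()
    return masks

def sparse_update_step(model, loss_fn, optimizer, inputs, targets, masks):
    """One optimization step that only touches the masked (selected) weights."""
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in masks and p.grad is not None:
                p.grad.mul_(masks[name])  # zero out gradients of frozen weights
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Toy usage: adapt a small classifier to synthetic "new task" data while
    # updating roughly 1% of its weights.
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    new_task = [(torch.randn(32, 20), torch.randint(0, 5, (32,))) for _ in range(8)]
    masks = build_sparse_masks(model, loss_fn, new_task, keep_ratio=0.01)
    for inputs, targets in new_task:
        sparse_update_step(model, loss_fn, optimizer, inputs, targets, masks)
```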
57 papers