Few-Shot Class-Incremental Learning
Few-shot class-incremental learning (FSCIL) tackles the challenge of continually training machine learning models on new classes with only a handful of labeled examples, while preventing catastrophic forgetting of previously learned classes. Current research focuses on improving model adaptability and stability through techniques such as contrastive learning, dynamic adaptation strategies (e.g., selective state space models), and the use of pre-trained models or prompt learning with vision transformers. These advances aim to correct inherent biases in existing evaluation metrics and to improve accuracy on both previously learned and newly introduced classes, which matters for applications that must learn continuously from limited data, such as real-time audio processing and medical image classification.
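To make the setting concrete, the sketch below illustrates one common prototype-based baseline often used as a starting point in FSCIL work: a frozen, pre-trained backbone embeds images, each class (base or incremental) is represented by the mean of its few-shot embeddings, and prediction is nearest-prototype by cosine similarity. Because no weights are updated during incremental sessions, earlier classes are not overwritten. This is a minimal sketch under the assumption of a PyTorch feature extractor; the `PrototypeFSCIL` class and its method names are illustrative, not drawn from any specific paper.

```python
# Minimal prototype-based FSCIL sketch (illustrative, not a specific paper's method).
# A frozen backbone embeds images; each class is stored as the L2-normalised mean
# of its few-shot embeddings; classification is nearest prototype by cosine similarity.
import torch
import torch.nn.functional as F


class PrototypeFSCIL:
    def __init__(self, backbone: torch.nn.Module):
        self.backbone = backbone.eval()   # frozen feature extractor (assumed pre-trained)
        self.prototypes = []              # one prototype vector per registered class
        self.class_ids = []               # class label associated with each prototype

    @torch.no_grad()
    def add_session(self, support_images: torch.Tensor, support_labels: torch.Tensor):
        """Register new classes from a few-shot support set (one incremental session)."""
        feats = F.normalize(self.backbone(support_images), dim=-1)
        for cls in support_labels.unique().tolist():
            proto = feats[support_labels == cls].mean(dim=0)
            self.prototypes.append(F.normalize(proto, dim=0))
            self.class_ids.append(cls)

    @torch.no_grad()
    def predict(self, images: torch.Tensor) -> torch.Tensor:
        """Nearest-prototype classification over all classes seen so far."""
        feats = F.normalize(self.backbone(images), dim=-1)
        protos = torch.stack(self.prototypes)   # (num_classes, feature_dim)
        scores = feats @ protos.T               # cosine similarities
        idx = scores.argmax(dim=1)
        return torch.tensor([self.class_ids[i] for i in idx.tolist()])
```

In practice, published methods build on this skeleton by, for example, training the backbone with contrastive objectives on the base classes or adapting prompts of a vision transformer, while keeping the incremental update cheap enough to work from only a few samples per new class.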