Soft Label Distillation

Soft label distillation is a machine learning technique that improves the performance of smaller "student" models by training them on the full predicted probability distributions (soft labels) of larger "teacher" models, rather than on one-hot ground-truth labels alone. Current research focuses on refining this process through strategies such as incorporating label noise during teacher training, optimizing prototype-based soft-label generation for imbalanced datasets, and addressing fairness by adjusting the smoothness of soft labels across classes. These advances improve model robustness, efficiency, and generalization, impacting applications such as image classification, object detection, and semantic segmentation.
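To make the core idea concrete, here is a minimal NumPy sketch of the classic soft-label distillation loss: a temperature-scaled KL divergence between teacher and student distributions, blended with the usual hard-label cross-entropy. The function names, the temperature value, and the `alpha` weighting are illustrative choices, not drawn from any specific paper above.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      temperature=4.0, alpha=0.5):
    """Blend the soft-label KL term with hard-label cross-entropy.

    alpha weights the distillation term; the T^2 factor keeps the soft
    term's gradient magnitude comparable across temperatures.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student) on the softened distributions, batch-averaged
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1).mean()
    # Standard cross-entropy against the one-hot ground-truth labels
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(hard_labels)), hard_labels] + 1e-12).mean()
    return alpha * (temperature ** 2) * kl + (1 - alpha) * ce
```

In practice the same computation is done on GPU tensors inside the student's training loop, with the teacher's logits either precomputed or produced by a frozen forward pass.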

Papers