Soft Label Distillation
Soft label distillation is a machine learning technique that improves the performance of a smaller "student" model by leveraging the knowledge embedded in the predicted class distributions (soft labels) of a larger "teacher" model. Current research refines this process through several strategies, including incorporating label noise during teacher training, optimizing prototype-based soft label generation for imbalanced datasets, and addressing fairness issues by adjusting the smoothness of soft labels across classes. These advances enhance model robustness, efficiency, and generalization, with impact on diverse applications such as image classification, object detection, and semantic segmentation.
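As a minimal sketch of the core idea, the snippet below implements the classic soft-label distillation loss in the style of Hinton et al.: a KL-divergence term between temperature-softened teacher and student distributions, blended with a standard cross-entropy term on the hard labels. The temperature `T`, mixing weight `alpha`, and function names are illustrative choices, not taken from the papers summarized above.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      T=4.0, alpha=0.5):
    # Blend of a soft-label loss (KL divergence to the teacher) and a
    # hard-label cross-entropy loss. T and alpha are example values.
    eps = 1e-12
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student); the T**2 factor keeps gradient magnitudes
    # comparable across temperatures, as in the original formulation.
    kl = np.sum(p_teacher * (np.log(p_teacher + eps) -
                             np.log(p_student + eps)), axis=-1)
    soft_loss = (T ** 2) * kl.mean()
    # Standard cross-entropy against the ground-truth hard labels.
    p_hard = softmax(student_logits)
    n = student_logits.shape[0]
    hard_loss = -np.log(p_hard[np.arange(n), hard_labels] + eps).mean()
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

A student whose logits match the teacher's incurs zero soft loss, so the combined loss pulls the student toward the teacher's full output distribution rather than only the argmax class.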