Self-Distillation
Self-distillation is a machine learning technique in which a model learns from its own predictions, improving accuracy and efficiency without requiring a separate teacher model. Current research applies self-distillation across diverse tasks and architectures, including spiking neural networks, transformers, and deep models for image, point cloud, and natural language processing. The approach is especially valuable in resource-constrained settings, where it enables model compression and better performance under limited data or compute, with applications in robotics, medical imaging, and natural language understanding.
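To make the idea concrete, here is a minimal sketch of one common self-distillation variant, in which a frozen snapshot of the network serves as its own teacher. This is an illustrative PyTorch example, not a method from any specific paper above; the `temperature` and `alpha` values, the loss weighting, and the snapshot-refresh schedule are all assumptions.

```python
# Minimal self-distillation sketch (assumed setup, not a specific paper's method).
import copy
import torch
import torch.nn.functional as F

def self_distillation_step(model, teacher, x, y, optimizer,
                           temperature=2.0, alpha=0.5):
    """One training step: the student fits both the hard labels and the
    softened predictions of a frozen copy of itself."""
    student_logits = model(x)
    with torch.no_grad():                 # the teacher only provides targets
        teacher_logits = teacher(x)

    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, y)

    # KL divergence between temperature-softened student and teacher outputs.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2                  # usual distillation loss scaling

    loss = alpha * ce + (1 - alpha) * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# The "teacher" is simply a frozen snapshot of the same network,
# refreshed e.g. at the end of each training generation:
#   teacher = copy.deepcopy(model).eval()
#   for p in teacher.parameters():
#       p.requires_grad_(False)
```

Because teacher and student share one architecture, no extra model needs to be designed or pretrained; the memory cost is a single frozen copy, and even that can be avoided in variants that distill between layers of the same network.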