Self-Distillation
Self-distillation is a machine learning technique in which a model learns from its own predictions, improving performance and efficiency without requiring a separately trained teacher model. Current research applies self-distillation across diverse tasks and architectures, including spiking neural networks, transformers, and deep learning models for image, point cloud, and natural language processing. The approach is particularly valuable in resource-constrained settings, where it enables model compression and improved performance under limited data or compute, with applications in robotics, medical imaging, and natural language understanding.
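As a concrete illustration, the sketch below shows one common flavor of self-distillation in PyTorch: an exponential-moving-average (EMA) copy of the model serves as its own teacher, and the student is trained on hard labels plus a KL term matching the teacher's softened predictions. This is a minimal sketch, not a reference implementation; the network, hyperparameters (temperature `T`, mixing weight `alpha`, EMA decay), and synthetic data are all illustrative assumptions, not details from any specific paper.

```python
# Minimal self-distillation sketch (assumes PyTorch is installed).
# The student learns from hard labels AND from softened predictions
# of an EMA copy of itself -- no separate teacher model is trained.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):  # illustrative toy classifier
    def __init__(self, in_dim=32, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, num_classes))
    def forward(self, x):
        return self.net(x)

student = SmallNet()
teacher = copy.deepcopy(student)          # EMA "self" teacher
for p in teacher.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T, alpha, ema_decay = 2.0, 0.5, 0.99      # temperature, loss mix, EMA rate

for step in range(100):
    x = torch.randn(64, 32)               # synthetic batch (stand-in for real data)
    y = torch.randint(0, 10, (64,))

    s_logits = student(x)
    with torch.no_grad():
        t_logits = teacher(x)

    # Hard-label loss + distillation loss against the model's own EMA copy.
    ce = F.cross_entropy(s_logits, y)
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                  F.softmax(t_logits / T, dim=-1),
                  reduction="batchmean") * T * T   # T^2 keeps gradient scale
    loss = (1 - alpha) * ce + alpha * kd

    opt.zero_grad()
    loss.backward()
    opt.step()

    # Update the teacher as an exponential moving average of the student.
    with torch.no_grad():
        for tp, sp in zip(teacher.parameters(), student.parameters()):
            tp.mul_(ema_decay).add_(sp, alpha=1 - ema_decay)
```

Other variants follow the same pattern with a different choice of teacher, for example a frozen snapshot from an earlier training generation (born-again style) or the model's own deeper layers supervising shallower ones; only the source of the soft targets changes.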