Self-Distillation
Self-distillation is a machine learning technique in which a model learns from its own predictions, improving performance and efficiency without requiring a separate teacher model. Current research applies self-distillation across diverse tasks and architectures, including spiking neural networks, transformers, and deep learning models for image, point cloud, and natural language processing. The approach is particularly valuable in resource-constrained settings, where it enables model compression and better performance under limited data or compute, with applications in robotics, medical imaging, and natural language understanding.
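To make the core idea concrete, here is a minimal sketch in PyTorch of one common self-distillation setup: the "teacher" is simply a frozen snapshot of the same network, and the student is trained on a weighted sum of the usual hard-label loss and a KL-divergence term on temperature-softened logits. The model, toy data, and hyperparameters (`T`, `alpha`) are illustrative assumptions, not taken from any particular paper.

```python
# Minimal self-distillation sketch (hypothetical model and data).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_step(student, teacher, x, y, optimizer, T=4.0, alpha=0.5):
    """One step: cross-entropy on labels + KL to the frozen teacher's soft targets."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(x)

    student_logits = student(x)
    hard_loss = F.cross_entropy(student_logits, y)
    # Soften both distributions with temperature T; scale by T^2 so the
    # gradient magnitude stays comparable to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    loss = alpha * hard_loss + (1 - alpha) * soft_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage: the teacher is a frozen copy of the student itself.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
teacher = copy.deepcopy(model)
for p in teacher.parameters():
    p.requires_grad_(False)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 32)           # toy batch of 8 examples
y = torch.randint(0, 10, (8,))   # toy labels
loss = distillation_step(model, teacher, x, y, optimizer)
```

In practice the teacher snapshot can be refreshed periodically (e.g., each generation of "born-again" training) or maintained as an exponential moving average of the student's weights; both are common variants of the same pattern.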