Distillation Learning
Distillation learning, also known as knowledge distillation, is a machine learning technique in which a large, high-performing "teacher" model transfers its knowledge to a smaller, more efficient "student" model. Current research applies the method across diverse areas, including image super-resolution, medical image segmentation, object detection in spiking neural networks (SNNs), and time series forecasting, often using techniques such as time-aware distillation or feature-based knowledge transfer. The approach is significant because it enables accurate models to be deployed on resource-constrained devices while improving the efficiency and generalization of models across a range of tasks.
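As a concrete illustration, the sketch below shows the classic soft-label distillation recipe in PyTorch: the student is trained on a weighted mix of the usual cross-entropy against ground-truth labels and the KL divergence to the teacher's temperature-softened output distribution. This is a minimal sketch, not any specific paper's method; the model architectures, temperature, and mixing weight `alpha` are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    """Soft-label distillation loss: KL to the softened teacher
    distribution blended with cross-entropy on hard labels.
    Temperature and alpha are illustrative hyperparameters."""
    # Soften both distributions; the T**2 factor keeps the KD gradient
    # on the same scale as the cross-entropy term.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(log_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Hypothetical teacher/student pair; any models with matching output
# dimensions work. The student is the smaller, deployable network.
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
student = nn.Sequential(nn.Linear(32, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(64, 32)          # dummy input batch
y = torch.randint(0, 10, (64,))  # dummy class labels

with torch.no_grad():            # the teacher stays frozen
    t_logits = teacher(x)

optimizer.zero_grad()
loss = distillation_loss(student(x), t_logits, y)
loss.backward()                  # gradients flow only into the student
optimizer.step()
```

In practice the teacher is a pretrained model and this step runs inside an ordinary training loop; only the student's parameters are updated.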