Progressive Distillation
Progressive distillation is a knowledge distillation technique in which a large or slow "teacher" model is compressed into a smaller or faster "student" model in successive stages rather than in a single pass. In diffusion models, for instance, each round trains the student to match two of the teacher's sampling steps with a single step, halving the number of denoising steps per round (Salimans & Ho, 2022). Current research applies the idea across diverse domains, including image generation (diffusion models and rectified flows), natural language processing (LLMs and BERT-style models), 3D scene representation (NeRFs), and knowledge graph embedding. By preserving most of the teacher's quality at a fraction of its cost, the approach enables deployment on resource-constrained devices, faster inference, and lower computational budgets across these applications.
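To make the step-halving idea concrete, below is a minimal sketch of one progressive-distillation round for a diffusion model. It assumes an x-prediction parameterization, a cosine noise schedule, and a hypothetical model(z, t) call signature; it is an illustrative sketch under those assumptions, not any specific paper's reference implementation.

```python
import torch
import torch.nn.functional as F

def alpha_sigma(t):
    # Assumed cosine schedule: alpha_t = cos(pi*t/2), sigma_t = sin(pi*t/2).
    return torch.cos(t * torch.pi / 2), torch.sin(t * torch.pi / 2)

def ddim_step(z_t, x_pred, t, s):
    # Deterministic DDIM step from time t to time s < t, given a prediction
    # x_pred of the clean sample (x-parameterization).
    a_t, s_t = alpha_sigma(t)
    a_s, s_s = alpha_sigma(s)
    return a_s * x_pred + s_s * (z_t - a_t * x_pred) / s_t

def distillation_target(teacher, z_t, t, n_steps):
    # Run two consecutive teacher DDIM steps, then solve for the single
    # x-prediction that makes one student step land on the same point.
    dt = 1.0 / n_steps
    with torch.no_grad():
        z_mid = ddim_step(z_t, teacher(z_t, t), t, t - dt)
        z_end = ddim_step(z_mid, teacher(z_mid, t - dt), t - dt, t - 2 * dt)
    a_t, s_t = alpha_sigma(t)
    a_e, s_e = alpha_sigma(t - 2 * dt)
    return (z_end - (s_e / s_t) * z_t) / (a_e - (s_e / s_t) * a_t)

def train_step(student, teacher, x0, n_steps, optimizer):
    # Sample a time on the student's coarser grid (every second teacher step).
    k = torch.randint(1, n_steps // 2 + 1, (x0.shape[0],), device=x0.device)
    t = (2.0 * k / n_steps).view(-1, *([1] * (x0.dim() - 1)))
    a_t, s_t = alpha_sigma(t)
    z_t = a_t * x0 + s_t * torch.randn_like(x0)  # forward-diffuse the data
    target = distillation_target(teacher, z_t, t, n_steps)
    loss = F.mse_loss(student(z_t, t), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the full procedure, the trained student typically becomes the teacher for the next round with n_steps halved, so repeating the round turns a sampler with, say, 1024 steps into one with only a handful of steps after roughly ten rounds.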