Progressive Distillation
Progressive distillation is a machine learning technique for efficiently transferring knowledge from a large, complex "teacher" model to a smaller, faster "student" model, typically through a sequence of intermediate distillation stages rather than a single teacher-to-student transfer (in diffusion models, for example, each stage trains a student to match two teacher sampling steps with one of its own, halving the step count). Current research applies it across diverse domains, including image generation (diffusion models and rectified flows), natural language processing (compressing LLMs and BERT models), 3D scene representation (NeRFs), and knowledge graphs. By shrinking models and cutting the number of inference steps, the approach enables deployment of high-performing models on resource-constrained devices while accelerating inference and reducing computational costs.
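The diffusion-model variant makes the staged structure concrete. Below is a minimal sketch, assuming a PyTorch-style setup: at each stage a student is trained to reproduce the result of two teacher sampling steps in a single step of its own, and the trained student then becomes the teacher for the next stage. The Denoiser architecture, the Euler-style ddim_step update, and all hyperparameters are illustrative placeholders, not the exact procedure of any particular paper.

```python
import copy
import torch
import torch.nn as nn

# Toy denoiser standing in for a real diffusion model (hypothetical architecture).
class Denoiser(nn.Module):
    def __init__(self, dim=16, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, width), nn.SiLU(), nn.Linear(width, dim)
        )

    def forward(self, x, t):
        # Condition on the timestep by concatenating it to the input.
        return self.net(torch.cat([x, t[:, None]], dim=-1))

def ddim_step(model, x, t, t_next):
    # One Euler-style update from t to t_next (simplified stand-in for DDIM).
    return x + (t_next - t)[:, None] * model(x, t)

def distill_stage(teacher, steps, dim=16, iters=500, batch=128):
    # Train a student that covers two teacher steps with one of its own.
    student = copy.deepcopy(teacher)
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(iters):
        x = torch.randn(batch, dim)
        # Sample a timestep on the student's coarse grid of steps // 2 steps.
        i = torch.randint(0, steps // 2, (batch,))
        t = 1.0 - i.float() * (2.0 / steps)
        t_mid, t_next = t - 1.0 / steps, t - 2.0 / steps
        with torch.no_grad():
            # Target: where two fine-grained teacher steps land.
            target = ddim_step(teacher, ddim_step(teacher, x, t, t_mid), t_mid, t_next)
        loss = ((ddim_step(student, x, t, t_next) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return student

# Progressive schedule: 1024 -> 512 -> 256 -> 128 sampling steps,
# with each stage's student promoted to teacher for the next stage.
model, steps = Denoiser(), 1024
for _ in range(3):
    model = distill_stage(model, steps)
    steps //= 2
```

Repeating the stage k times cuts the sampling-step count by a factor of 2^k, and each training target stays cheap to compute because it needs only two teacher evaluations.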