Latent Distillation
Latent distillation is a machine learning technique for transferring knowledge from a complex "teacher" model to a smaller, more efficient "student" model by aligning their latent representations rather than only their final outputs. Current research applies it to medical image analysis, privacy-preserving learning, and the acceleration of generative models such as diffusion models and probabilistic circuits, using techniques including generative knowledge distillation and variational autoencoders. The approach yields compressed models with improved robustness (e.g., against adversarial attacks) and better generalization across datasets, with impact in fields ranging from healthcare to computer vision.
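To make the idea concrete, here is a minimal, illustrative sketch of a latent-distillation training objective: the student is penalized both for drifting from the teacher's latent representation (a mean-squared-error term) and for mismatching the teacher's output distribution (a KL-divergence term). The function names, the `alpha` weighting, and the plain-list inputs are assumptions for illustration, not the formulation of any specific paper.

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length latent vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def kl_divergence(p, q):
    """KL(p || q) between two discrete probability distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def latent_distillation_loss(student_latent, teacher_latent,
                             student_probs, teacher_probs, alpha=0.5):
    """Weighted sum of a latent-matching term and an output-matching term.

    alpha trades off matching the teacher's internal (latent) representation
    against matching its output distribution; both terms are zero when the
    student reproduces the teacher exactly.
    """
    latent_term = mse(student_latent, teacher_latent)
    output_term = kl_divergence(teacher_probs, student_probs)
    return alpha * latent_term + (1 - alpha) * output_term

# A student that matches the teacher exactly incurs zero loss;
# any latent or output mismatch makes the loss strictly positive.
perfect = latent_distillation_loss([1.0, 2.0], [1.0, 2.0],
                                   [0.5, 0.5], [0.5, 0.5])
imperfect = latent_distillation_loss([1.0, 0.0], [0.0, 0.0],
                                     [0.25, 0.75], [0.5, 0.5])
```

In practice the latent term would compare intermediate feature maps of neural networks (often through a learned projection when dimensions differ), and both terms would be minimized by gradient descent; the scalar version above only isolates the structure of the objective.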
Papers
September 26, 2024
September 4, 2024
July 22, 2024
June 9, 2024
May 27, 2024
April 21, 2024
February 4, 2024
December 14, 2023
November 23, 2023
September 22, 2023
August 20, 2023
March 28, 2023