Map Distillation
Map distillation is a machine learning technique for transferring knowledge from a complex "teacher" model to a smaller, more efficient "student" model, improving the student's performance while reducing computational cost. Current research explores a range of distillation methods, including those leveraging attention mechanisms, variational inference for diffusion models and mixtures of experts, and hierarchical approaches for handling multi-modal data or complex structures such as ICD codes. The technique matters because it makes large models more efficient and accessible across applications ranging from image recognition and video analysis to generative modeling and medical diagnosis.
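The teacher-to-student transfer described above is commonly implemented by training the student against temperature-softened teacher outputs. The sketch below is a minimal, framework-free illustration of that soft-target loss (in the style of Hinton et al.'s knowledge distillation); the function names and the plain-numpy formulation are illustrative assumptions, not taken from any specific paper listed here.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T
    # (a common convention in knowledge-distillation sketches like this).
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * T ** 2)
```

If the student's logits match the teacher's exactly, the loss is zero; any mismatch yields a positive value, so minimizing it pulls the student's output distribution toward the teacher's. In practice this term is usually mixed with the ordinary cross-entropy loss on ground-truth labels.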