Contrastive Distillation
Contrastive distillation is a knowledge-transfer technique in which a smaller "student" model learns from a larger, more powerful "teacher" model by matching representations through a contrastive objective: student embeddings are pulled toward the corresponding teacher embeddings and pushed away from mismatched ones. Current research applies the method across diverse modalities (images, LiDAR, event data, text, speech) and tasks (classification, segmentation, retrieval, generation), using the contrastive objective to improve feature quality and generalization. The approach is particularly valuable in resource-constrained settings or when labeled data is scarce, offering a path toward more efficient and adaptable models, with broad applications in computer vision, natural language processing, and other fields.
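To make the idea concrete, below is a minimal sketch of an InfoNCE-style contrastive distillation loss in PyTorch. The function name, projection dimension, temperature value, and the use of in-batch negatives are illustrative assumptions rather than the formulation of any specific paper: each student embedding is treated as a query whose positive is the teacher embedding of the same sample, with the other teacher embeddings in the batch serving as negatives.

```python
# Minimal sketch of an InfoNCE-style contrastive distillation loss (assumed setup).
import torch
import torch.nn.functional as F


def contrastive_distillation_loss(student_feats: torch.Tensor,
                                  teacher_feats: torch.Tensor,
                                  temperature: float = 0.1) -> torch.Tensor:
    """Pull each student embedding toward its teacher embedding and push it
    away from the teacher embeddings of the other samples in the batch.

    student_feats: (B, D) features from the student model (after projection).
    teacher_feats: (B, D) features from the frozen teacher model.
    """
    # L2-normalize so the dot product is a cosine similarity.
    s = F.normalize(student_feats, dim=-1)
    t = F.normalize(teacher_feats, dim=-1)

    # Similarity of every student to every teacher in the batch: (B, B).
    logits = s @ t.T / temperature

    # The matching teacher (same index) is the positive; all others are negatives.
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy usage: random tensors standing in for student/teacher features.
    student = torch.randn(8, 128, requires_grad=True)
    teacher = torch.randn(8, 128)  # teacher outputs are typically detached/frozen
    loss = contrastive_distillation_loss(student, teacher)
    loss.backward()
    print(f"contrastive distillation loss: {loss.item():.4f}")
```

In practice this term is usually combined with a task loss (e.g. cross-entropy on labels) and possibly a standard logit-distillation term; the relative weighting is a design choice that varies across the works surveyed here.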