Contrastive Distillation
Contrastive distillation is a knowledge transfer technique in which a smaller "student" model learns from a larger, more powerful "teacher" model through a contrastive objective that pulls matching student-teacher representations together and pushes mismatched ones apart. Current research applies the method across diverse modalities (images, LiDAR, event data, text, speech) and tasks (classification, segmentation, retrieval, generation) to strengthen feature representations and generalization. The approach is particularly valuable in resource-constrained settings or when labeled data is scarce, offering a path toward more efficient and adaptable machine learning models with broad applications in computer vision, natural language processing, and other fields.
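A minimal sketch of the core idea, assuming student and teacher features are projected into a shared embedding space and compared with an InfoNCE-style loss; the names (contrastive_distillation_loss, student_proj, teacher_proj, embed_dim, temperature) are illustrative assumptions, not drawn from any specific paper in this collection.

```python
# Sketch of a contrastive distillation loss: each student embedding should match
# the teacher embedding of the same sample (positive pair) and repel the teacher
# embeddings of other samples in the batch (negatives).
import torch
import torch.nn as nn
import torch.nn.functional as F


def contrastive_distillation_loss(student_emb, teacher_emb, temperature=0.07):
    # Normalize so dot products become cosine similarities.
    s = F.normalize(student_emb, dim=1)
    t = F.normalize(teacher_emb, dim=1)
    # Pairwise similarity matrix: entry (i, j) compares student i with teacher j.
    logits = s @ t.t() / temperature
    # The positive for student i is teacher i, i.e. the diagonal entries.
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    batch, s_dim, t_dim, embed_dim = 32, 128, 512, 64
    # Small projection heads map student/teacher features into a shared space.
    student_proj = nn.Linear(s_dim, embed_dim)
    teacher_proj = nn.Linear(t_dim, embed_dim)

    student_feat = torch.randn(batch, s_dim)      # stand-in for student backbone output
    with torch.no_grad():
        teacher_feat = torch.randn(batch, t_dim)  # stand-in for frozen teacher output

    loss = contrastive_distillation_loss(student_proj(student_feat),
                                         teacher_proj(teacher_feat))
    loss.backward()
    print(f"contrastive distillation loss: {loss.item():.4f}")
```

In practice the teacher backbone is kept frozen while the student backbone and both projection heads are trained, and this contrastive term is typically combined with the task loss (e.g. cross-entropy on labels) when supervision is available.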