Contrastive Distillation

Contrastive distillation is a knowledge transfer technique that improves smaller "student" models by aligning their representations with those of larger, more powerful "teacher" models through a contrastive objective: student and teacher embeddings of the same input are pulled together while embeddings of different inputs are pushed apart. Current research applies the method across diverse modalities (images, LiDAR, event data, text, speech) and tasks (classification, segmentation, retrieval, generation) to strengthen feature representation and generalization. The approach is particularly valuable in resource-constrained settings or when labeled data is scarce, offering a path toward more efficient and adaptable machine learning models with broad applications in computer vision, natural language processing, and other fields.
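
To make the idea concrete, below is a minimal sketch (not any specific paper's method) of an InfoNCE-style contrastive distillation loss: each student embedding is treated as a query, the teacher embedding of the same input as its positive, and the other samples in the batch as negatives. It assumes PyTorch, and all names (e.g., `contrastive_distillation_loss`, `temperature`) are illustrative.

```python
import torch
import torch.nn.functional as F


def contrastive_distillation_loss(student_feats: torch.Tensor,
                                  teacher_feats: torch.Tensor,
                                  temperature: float = 0.1) -> torch.Tensor:
    """student_feats, teacher_feats: (batch, dim) embeddings of the same inputs."""
    # L2-normalize so the dot product is cosine similarity.
    s = F.normalize(student_feats, dim=-1)
    t = F.normalize(teacher_feats, dim=-1)

    # Similarity of every student embedding to every teacher embedding.
    logits = s @ t.T / temperature  # (batch, batch)

    # The matching teacher embedding (the diagonal) is the positive;
    # all other samples in the batch serve as negatives.
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy usage: random features standing in for student/teacher encoder outputs.
    student = torch.randn(8, 128, requires_grad=True)
    teacher = torch.randn(8, 128)  # teacher is typically frozen (no gradient)
    loss = contrastive_distillation_loss(student, teacher)
    loss.backward()
    print(f"contrastive distillation loss: {loss.item():.4f}")
```

In practice this term is usually combined with a task loss (and often a standard logit-distillation term), and the student features may pass through a small projection head before the contrastive comparison.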

Papers