Relation Distillation

Relation distillation in deep learning focuses on transferring relational knowledge, rather than just individual feature representations, from a larger "teacher" model to a smaller "student" model. Current research emphasizes distilling various types of relations, including instance-instance, prototype-instance, and pixel-wise relations, across diverse tasks such as face recognition, semantic segmentation, and object detection, often employing contrastive learning and graph convolutional networks. Distilling relations improves the accuracy of compact student models without adding inference cost, helps mitigate catastrophic forgetting in continual learning, and makes strong models practical for resource-constrained applications.
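
The core idea can be illustrated with a small sketch: instead of matching teacher and student features directly, the student is trained so that the pairwise (instance-instance) similarities within a batch match those of the teacher. The example below is a minimal, self-contained PyTorch sketch in the spirit of similarity-preserving relation distillation; the class name, loss weighting, and toy networks are illustrative assumptions, not the method of any particular paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationDistillationLoss(nn.Module):
    """Match row-normalized pairwise-similarity (Gram) matrices of teacher and student features.

    Because the relation matrices are batch x batch, the teacher and student
    feature dimensions do not need to agree.
    """

    def forward(self, student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
        # Flatten to (batch, dim) in case the features are convolutional maps.
        s = student_feats.flatten(start_dim=1)
        t = teacher_feats.flatten(start_dim=1)

        # Pairwise similarity matrices over the batch, L2-normalized per row.
        s_rel = F.normalize(s @ s.t(), p=2, dim=1)
        t_rel = F.normalize(t @ t.t(), p=2, dim=1)

        # Penalize the discrepancy between the two relation matrices.
        return F.mse_loss(s_rel, t_rel)


if __name__ == "__main__":
    # Toy teacher/student backbones (illustrative only).
    batch = torch.randn(8, 3, 32, 32)
    teacher = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.AdaptiveAvgPool2d(1)).eval()
    student = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.AdaptiveAvgPool2d(1))

    with torch.no_grad():
        t_feats = teacher(batch)
    s_feats = student(batch)

    # In practice this term is added to the usual task loss with a weighting factor.
    loss = RelationDistillationLoss()(s_feats, t_feats)
    loss.backward()
    print(loss.item())
```

A design point worth noting: since only the batch-level relation structure is compared, this kind of loss sidesteps the dimension-mismatch problem that direct feature-matching distillation must handle with projection layers.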

Papers