Relation Distillation
Relation distillation in deep learning focuses on transferring relational knowledge, rather than just individual feature representations, from a larger "teacher" model to a smaller "student" model. Current research emphasizes distilling various types of relations, including instance-instance, prototype-instance, and pixel-wise relations, across diverse tasks such as face recognition, semantic segmentation, and object detection, often employing contrastive learning and graph convolutional networks. This technique improves the performance of compact student models, addressing challenges such as catastrophic forgetting in continual learning and enabling accurate models in resource-constrained applications.
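To make the idea concrete, below is a minimal sketch of instance-instance relation distillation in PyTorch. It is one common formulation, not any specific paper's method: the student is trained to match the teacher's pairwise cosine-similarity matrix over a batch, so it inherits the teacher's relational structure between samples rather than its raw feature values. The tensor shapes and the function name `relation_distillation_loss` are illustrative assumptions.

```python
# Minimal sketch of instance-instance relation distillation (assumed
# formulation): match the student's pairwise similarity matrix to the
# teacher's, rather than matching features directly.
import torch
import torch.nn.functional as F

def relation_distillation_loss(student_feats: torch.Tensor,
                               teacher_feats: torch.Tensor) -> torch.Tensor:
    """student_feats, teacher_feats: (batch, dim) embeddings of the same
    batch of inputs; the two feature dimensions may differ."""
    # Pairwise cosine-similarity matrices encode instance-instance relations.
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    student_rel = s @ s.t()   # (batch, batch) student relation matrix
    teacher_rel = t @ t.t()   # (batch, batch) teacher relation matrix
    # The teacher is frozen, so its relation matrix is detached.
    return F.mse_loss(student_rel, teacher_rel.detach())

# Usage: add this loss to the student's task loss during training.
student_feats = torch.randn(32, 128, requires_grad=True)  # student embeddings
teacher_feats = torch.randn(32, 512)                      # teacher embeddings
loss = relation_distillation_loss(student_feats, teacher_feats)
loss.backward()
```

Because only the similarity matrices are compared, the student and teacher feature dimensions need not match, which is one practical reason relation-based losses are attractive for compressing large models.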
Papers
GRDD: A Dataset for Greek Dialectal NLP
Stergios Chatzikyriakidis, Chatrine Qwaider, Ilias Kolokousis, Christina Koula, Dimitris Papadakis, Efthymia Sakellariou
Deep Image Harmonization with Globally Guided Feature Transformation and Relation Distillation
Li Niu, Linfeng Tan, Xinhao Tao, Junyan Cao, Fengjun Guo, Teng Long, Liqing Zhang