Relational Knowledge Distillation

Relational knowledge distillation (RKD) transfers not only individual feature representations but also the relationships among them, such as pairwise distances and angles, from a larger "teacher" model to a smaller, more efficient "student" model. Current research emphasizes graph convolutional networks and transformers for capturing these relationships, particularly in challenging domains like low-resolution image recognition, 3D point cloud processing, and cross-domain adaptation. The technique improves model efficiency and robustness, especially in scenarios with limited data or computational resources, with impact across fields including computer vision, autonomous driving, and knowledge graph integration.
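A common instantiation of this idea is the distance-wise RKD loss (Park et al., 2019), which matches the structure of pairwise distances between teacher and student embeddings rather than the embeddings themselves. Below is a minimal PyTorch sketch of that loss; the batch size, embedding dimensions, and the `rkd_distance_loss` helper are illustrative assumptions, not drawn from any specific paper's codebase.

```python
import torch
import torch.nn.functional as F


def normalized_pairwise_distances(x: torch.Tensor) -> torch.Tensor:
    """Euclidean distances between every pair of embeddings in a batch,
    normalized by the mean nonzero distance so teacher and student
    relational structures are compared at the same scale."""
    d = torch.cdist(x, x, p=2)
    mean_d = d[d > 0].mean()
    return d / (mean_d + 1e-8)


def rkd_distance_loss(student_emb: torch.Tensor,
                      teacher_emb: torch.Tensor) -> torch.Tensor:
    """Distance-wise relational KD loss: penalize differences between
    the normalized pairwise-distance matrices of student and teacher."""
    with torch.no_grad():
        t_d = normalized_pairwise_distances(teacher_emb)
    s_d = normalized_pairwise_distances(student_emb)
    # Huber (smooth L1) loss, as in the original RKD formulation.
    return F.smooth_l1_loss(s_d, t_d)


if __name__ == "__main__":
    # Hypothetical shapes: batch of 8, teacher dim 512, student dim 128.
    teacher = torch.randn(8, 512)
    student = torch.randn(8, 128, requires_grad=True)
    loss = rkd_distance_loss(student, teacher)
    loss.backward()
    print(loss.item())
```

Because the loss compares relational structure (an N×N distance matrix) rather than raw features, the teacher and student embedding dimensions need not match, which is one practical advantage of relational over feature-matching distillation.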

Papers