Relational Knowledge Distillation
Relational knowledge distillation (RKD) transfers not only individual feature representations but also the relationships among them from a larger "teacher" model to a smaller, more efficient "student" model. Current research emphasizes graph convolutional networks and transformers for capturing these relationships, especially in challenging settings such as low-resolution image recognition, 3D point cloud processing, and cross-domain adaptation. The technique improves model efficiency and robustness, particularly when data or compute is limited, with impact across computer vision, autonomous driving, and knowledge graph integration.
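To make the core idea concrete, below is a minimal sketch of a distance-wise relational distillation loss in PyTorch: instead of matching teacher and student features directly, the student is trained to reproduce the normalized pairwise-distance structure of the teacher's embeddings within a batch. The function names, normalization, and loss choice here are illustrative assumptions, not any specific paper's reference implementation.

```python
import torch
import torch.nn.functional as F


def pairwise_distances(embeddings: torch.Tensor) -> torch.Tensor:
    """Euclidean distance between every pair of embeddings in a batch (N x D -> N x N)."""
    squared = embeddings.pow(2).sum(dim=1)
    dist_sq = squared.unsqueeze(0) + squared.unsqueeze(1) - 2.0 * embeddings @ embeddings.t()
    return dist_sq.clamp(min=0.0).sqrt()


def relational_distance_loss(student_emb: torch.Tensor, teacher_emb: torch.Tensor) -> torch.Tensor:
    """Distance-wise relational KD: match the teacher's pairwise-distance structure, not raw features."""
    with torch.no_grad():
        t_dist = pairwise_distances(teacher_emb)
        t_dist = t_dist / (t_dist.mean() + 1e-8)  # normalize so the relation is scale-invariant
    s_dist = pairwise_distances(student_emb)
    s_dist = s_dist / (s_dist.mean() + 1e-8)
    return F.smooth_l1_loss(s_dist, t_dist)


# Typical usage (hypothetical training loop): combine with the task loss.
#   student_emb = student(x)
#   with torch.no_grad():
#       teacher_emb = teacher(x)
#   loss = task_loss + lambda_rkd * relational_distance_loss(student_emb, teacher_emb)
```

Because only relative structure is matched, the student and teacher embedding dimensions need not agree, which is one reason relational losses are convenient when compressing into much smaller architectures.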