Paper ID: 2304.11923

Improving Knowledge Distillation via Transferring Learning Ability

Long Liu, Tong Li, Hui Cheng

Existing knowledge distillation methods generally follow a teacher-student approach, in which the student network learns solely from a well-trained teacher. However, this approach overlooks the inherent difference in learning ability between the teacher and student networks, giving rise to the capacity-gap problem. To address this limitation, we propose a novel method called SLKD.
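For context, the conventional teacher-student setup the abstract refers to is typically trained with a Hinton-style distillation loss: the student matches the teacher's temperature-softened outputs in addition to the ground-truth labels. The sketch below illustrates that standard objective only (it is not the paper's SLKD method); the function name and hyperparameters are illustrative.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Standard teacher-student distillation loss (Hinton-style KD)."""
    # Soft targets: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so gradients keep a comparable magnitude.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Hard targets: ordinary cross-entropy with the true labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce
```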

Submitted: Apr 24, 2023