Scale Distillation
Scale distillation is a knowledge-transfer technique in machine learning that improves a student model by leveraging a teacher model trained on a simpler, smaller-scale version of the task. Current research applies this approach in several domains, including image super-resolution with diffusion models such as Stable Diffusion and semi-supervised object detection, where it helps address challenges like class imbalance and noisy pseudo-labels. By making the training of complex models more efficient and effective, the technique is particularly valuable when labeled data is scarce or computational resources are limited, and it leads to improved performance on downstream tasks.
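To make the idea concrete, below is a minimal sketch of a scale-distillation training step for super-resolution, assuming a teacher pretrained at a smaller scale factor (e.g., 2x) that stays frozen while a student learns a larger scale factor (e.g., 4x). The network, function, and variable names are hypothetical placeholders rather than any specific paper's implementation; the loss simply combines a supervised term on the high-resolution ground truth with a distillation term that matches the student to the teacher's easier, smaller-scale prediction.

```python
# Hypothetical scale-distillation sketch (not the method of any specific paper).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRNet(nn.Module):
    """Toy upscaling network used as a stand-in for a real SR model."""
    def __init__(self, scale: int):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        x = F.interpolate(x, scale_factor=self.scale, mode="bilinear",
                          align_corners=False)
        return self.body(x)

# Assumed setup: teacher pretrained at 2x (frozen), student trained at 4x.
teacher = TinySRNet(scale=2).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

student = TinySRNet(scale=4)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

def distillation_step(lr_img, hr_img, alpha=0.5):
    """One step: supervised loss against the HR ground truth plus a
    distillation loss against the teacher's smaller-scale prediction."""
    with torch.no_grad():
        teacher_out = teacher(lr_img)          # teacher's 2x prediction
    student_out = student(lr_img)              # student's 4x prediction
    # Downsample the student's output to the teacher's resolution for matching.
    student_at_teacher_scale = F.interpolate(
        student_out, size=teacher_out.shape[-2:], mode="bilinear",
        align_corners=False)
    loss_sup = F.l1_loss(student_out, hr_img)                        # task loss
    loss_distill = F.l1_loss(student_at_teacher_scale, teacher_out)  # KD loss
    loss = (1 - alpha) * loss_sup + alpha * loss_distill
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy tensors showing the expected shapes (batch, channels, height, width).
lr = torch.randn(2, 3, 16, 16)
hr = torch.randn(2, 3, 64, 64)   # 4x the low-resolution spatial size
print(distillation_step(lr, hr))
```

The weighting `alpha` between the supervised and distillation terms is an illustrative choice; in practice the balance, the distillation target, and the resolution at which student and teacher are compared all depend on the specific task and model family.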