Small Scale
Research on "small scale" in machine learning and related fields focuses on developing efficient and effective models and algorithms that perform well despite limited data or computational resources. Current efforts concentrate on adapting existing architectures like BERT, VAEs, and U-Nets, employing techniques such as knowledge distillation, synthetic data augmentation, and specialized training strategies to overcome challenges posed by small datasets and imbalanced classes. This work is significant because it addresses the limitations of large models, enabling deployment in resource-constrained environments and facilitating applications where large datasets are difficult or impossible to obtain, such as in personalized medicine or specialized robotics.
Papers
Small Total-Cost Constraints in Contextual Bandits with Knapsacks, with Application to Fairness
Evgenii Chzhen, Christophe Giraud, Zhen Li, Gilles Stoltz
VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
Zhiwei Hao, Jianyuan Guo, Kai Han, Han Hu, Chang Xu, Yunhe Wang