Good Teacher
Research on "good teacher" models focuses on improving knowledge transfer and reducing bias across machine learning contexts, primarily through knowledge distillation. Current efforts center on optimizing teacher-student architectures, including ensembles and specialized teachers, and on refining distillation algorithms for efficiency and robustness, particularly in data-limited settings. These advances improve model performance, cut computational costs, and mitigate bias in applications ranging from image recognition and natural language processing to educational technology.
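For context, the mechanism these papers build on is the classic knowledge-distillation objective of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution alongside the usual supervised loss. Below is a minimal PyTorch sketch of that loss; the temperature and mixing weight are illustrative defaults, not values drawn from any of the listed papers.

```python
# Minimal sketch of the standard knowledge-distillation loss
# (Hinton et al., 2015). Hyperparameters here are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft KL term (student mimics the teacher's softened
    distribution) with the hard cross-entropy on the true labels."""
    # Temperature-softened distributions; scaling by T^2 restores the
    # gradient magnitude of the soft term to match the hard term.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(log_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Variants surveyed here, such as the one-step diffusion distillers (SwiftBrush v2, adversarial score identity distillation), replace the soft KL term with score- or adversarial-matching objectives but keep the same teacher-to-student transfer structure.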
Papers
LLM-Driven Learning Analytics Dashboard for Teachers in EFL Writing Education
Minsun Kim, SeonGyeom Kim, Suyoun Lee, Yoosang Yoon, Junho Myung, Haneul Yoo, Hyunseung Lim, Jieun Han, Yoonsu Kim, So-Yeon Ahn, Juho Kim, Alice Oh, Hwajung Hong, Tak Yeon Lee
Adversarial Score identity Distillation: Rapidly Surpassing the Teacher in One Step
Mingyuan Zhou, Huangjie Zheng, Yi Gu, Zhendong Wang, Hai Huang
Elementary School Students' and Teachers' Perceptions Towards Creative Mathematical Writing with Generative AI
Yukyeong Song, Jinhee Kim, Wanli Xing, Zifeng Liu, Chenglu Li, Hyunju Oh
SwiftBrush v2: Make Your One-step Diffusion Model Better Than Its Teacher
Trung Dao, Thuan Hoang Nguyen, Thanh Le, Duc Vu, Khoi Nguyen, Cuong Pham, Anh Tran