Prompt Distillation
Prompt distillation is a technique for transferring knowledge from a large, powerful "teacher" model to a smaller, more efficient "student" model by using prompts, carefully crafted input instructions, as the vehicle for transfer: the behavior the teacher exhibits when conditioned on a prompt is distilled into the student's weights, as sketched below. Current research focuses on unsupervised methods, particularly for vision-language models and large language models, with the goals of improving zero-shot generalization and reducing the need for labeled data. The approach matters because it enables deployment of capable models on resource-constrained devices and makes fine-tuning for specific tasks more efficient, affecting both research on model efficiency and practical applications that require rapid adaptation to new domains.
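To make the mechanics concrete, here is a minimal, hypothetical PyTorch sketch of one common formulation: a frozen teacher produces next-token logits conditioned on a crafted prompt plus the task input, while a smaller student sees only the task input and is trained to match the teacher's softened output distribution with a KL loss. The TinyLM module, the model sizes, the temperature, and the prompt/query split are illustrative assumptions, not the method of any specific paper listed on this page.

```python
# Minimal sketch of prompt distillation with a soft-label (KL) objective.
# All names, sizes, and data here are illustrative stand-ins.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, PROMPT_LEN, QUERY_LEN, TEMP = 100, 8, 4, 2.0

class TinyLM(nn.Module):
    """Toy causal LM: embed tokens, run a GRU, predict next-token logits."""
    def __init__(self, vocab=VOCAB, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, tokens):
        h, _ = self.rnn(self.emb(tokens))
        return self.head(h[:, -1])          # logits for the next token

teacher, student = TinyLM(dim=128), TinyLM(dim=32)
teacher.eval()                              # teacher stays frozen
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

prompt = torch.randint(0, VOCAB, (16, PROMPT_LEN))  # crafted instruction tokens
query = torch.randint(0, VOCAB, (16, QUERY_LEN))    # task input tokens

with torch.no_grad():
    # Teacher conditions on the full prompt + query.
    t_logits = teacher(torch.cat([prompt, query], dim=1))

# Student sees only the query; the prompt's effect is distilled into its weights.
s_logits = student(query)

loss = F.kl_div(
    F.log_softmax(s_logits / TEMP, dim=-1),
    F.softmax(t_logits / TEMP, dim=-1),
    reduction="batchmean",
) * TEMP ** 2

loss.backward()
opt.step()
```

In practice the same objective is applied over a corpus of unlabeled inputs, which is why the paragraph above emphasizes unsupervised settings: the teacher's prompted predictions serve as the only supervision signal.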