Prompt Distillation

Prompt distillation is a technique for transferring knowledge from a large, powerful "teacher" model to a smaller, more efficient "student" model, using prompts (carefully crafted input instructions) as the medium of transfer. Current research focuses on unsupervised methods, particularly for vision-language models and large language models, aiming to improve zero-shot generalization and reduce the need for labeled data. The approach is significant because it enables powerful models' capabilities to be deployed on resource-constrained devices and supports more efficient fine-tuning for specific tasks, which matters both for research on model efficiency and for practical applications that require rapid adaptation to new domains.
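
The sketch below illustrates one common instantiation of this idea under simplifying assumptions: toy encoders stand in for a large frozen teacher and a smaller frozen student backbone, and the only trainable parameters are a short sequence of soft prompt vectors prepended to the student's input, optimized with a KL-divergence loss against the teacher's outputs on unlabeled data. The module names, dimensions, and training loop here are illustrative placeholders, not the setup of any specific paper; published methods differ in their backbones (e.g., CLIP-style vision-language models) and objectives.

```python
# Minimal prompt-distillation sketch (PyTorch). Assumptions: toy mean-pooling
# encoders replace real teacher/student backbones; training is unsupervised,
# matching the teacher's softened output distribution.
import torch
import torch.nn as nn
import torch.nn.functional as F

EMB, N_CLASSES, PROMPT_LEN, SEQ_LEN = 64, 10, 4, 16

class ToyEncoder(nn.Module):
    """Stand-in for a backbone: mean-pools token embeddings into class logits."""
    def __init__(self, emb=EMB, n_classes=N_CLASSES):
        super().__init__()
        self.proj = nn.Linear(emb, n_classes)

    def forward(self, tokens):            # tokens: (batch, seq, emb)
        return self.proj(tokens.mean(dim=1))

teacher = ToyEncoder()                    # frozen "large" teacher
student = ToyEncoder()                    # frozen, smaller student backbone
for p in list(teacher.parameters()) + list(student.parameters()):
    p.requires_grad_(False)

# The only trainable parameters: soft prompt vectors prepended to the student's input.
soft_prompt = nn.Parameter(torch.randn(PROMPT_LEN, EMB) * 0.02)
optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)
temperature = 2.0

for step in range(100):
    x = torch.randn(32, SEQ_LEN, EMB)     # unlabeled batch of token embeddings
    with torch.no_grad():
        teacher_logits = teacher(x)
    prompted = torch.cat([soft_prompt.expand(x.size(0), -1, -1), x], dim=1)
    student_logits = student(prompted)
    # Distillation objective: KL between softened student and teacher distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because only the prompt vectors receive gradients, the student backbone stays untouched, which is what makes this kind of distillation cheap enough for rapid adaptation to new domains.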

Papers