Paper ID: 2409.01930

Efficient LLM Context Distillation

Rajesh Upadhayayaya, Zachary Smith, Christopher Kottmyer, Manish Raj Osti

This paper investigates context distillation, a method that extends the utility of task-specific examples by internalizing them, thereby augmenting the example set available to the model at inference.
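The core idea can be illustrated with a toy sketch (not the paper's implementation): a "teacher" distribution stands in for a model conditioned on in-context examples, and a context-free "student" is trained to match it by minimizing KL divergence, so the examples are internalized into the student's parameters. All names and values here are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kl(p, q):
    # KL(p || q) for discrete distributions
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Hypothetical teacher: a context-conditioned model's distribution over 4 labels.
teacher_logits = np.array([2.0, 0.5, -1.0, 0.0])
teacher_p = softmax(teacher_logits)

# Context-free student starts uniform and "internalizes" the teacher
# via gradient descent on KL(teacher || student).
student_logits = np.zeros(4)
lr = 0.5
initial_kl = kl(teacher_p, softmax(student_logits))
for _ in range(200):
    student_p = softmax(student_logits)
    # d/ds KL(p || softmax(s)) = softmax(s) - p
    student_logits -= lr * (student_p - teacher_p)

final_kl = kl(teacher_p, softmax(student_logits))
```

After training, the student reproduces the teacher's behavior without needing the examples in its context window.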

Submitted: Sep 3, 2024