Paper ID: 2405.11465
Effective In-Context Example Selection through Data Compression
Zhongxiang Sun, Kepu Zhang, Haoyu Wang, Xiao Zhang, Jun Xu
In-context learning has been extensively validated in large language models. However, the mechanism behind in-context example selection, a crucial ingredient in this approach, and strategies for performing it lack systematic and in-depth research. In this paper, we propose a data compression approach to the selection of in-context examples. We introduce a two-stage method that effectively chooses relevant examples and retains sufficient information about the training dataset within the selected in-context examples. Our method yields an average improvement of 5.90% across five real-world datasets using four language models.
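The abstract describes a two-stage selection pipeline (choose relevant examples, then retain enough information about the training set). As a loose illustration only, and not the authors' actual algorithm, one could approximate "relevance" and "information retention" with compressed-length heuristics, e.g. gzip-based normalized compression distance; all function names here are hypothetical.

```python
import gzip

def compression_distance(a: str, b: str) -> float:
    # Normalized compression distance via gzip: smaller values mean the
    # two strings share more compressible (redundant) structure.
    ca = len(gzip.compress(a.encode()))
    cb = len(gzip.compress(b.encode()))
    cab = len(gzip.compress((a + " " + b).encode()))
    return (cab - min(ca, cb)) / max(ca, cb)

def select_examples(query: str, pool: list[str], k: int = 2) -> list[str]:
    # Stage 1 (relevance, illustrative): rank candidates by compression
    # distance to the query.
    ranked = sorted(pool, key=lambda ex: compression_distance(query, ex))
    # Stage 2 (coverage, illustrative): greedily keep examples whose
    # inclusion still grows the compressed size of the selected set,
    # i.e. which contribute information not already present.
    selected: list[str] = []
    for ex in ranked:
        if len(selected) == k:
            break
        before = len(gzip.compress(" ".join(selected).encode())) if selected else 0
        after = len(gzip.compress(" ".join(selected + [ex]).encode()))
        if after > before:
            selected.append(ex)
    return selected
```

This sketch only conveys the flavor of compression-based scoring; the paper's method operates over a training dataset with language models, which this toy gzip proxy does not capture.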
Submitted: May 19, 2024