In-Context Learning
In-context learning (ICL) is a paradigm in machine learning in which a model adapts to new tasks from a handful of examples supplied directly in its input, without any parameter updates. Current research focuses on understanding ICL's mechanisms, particularly in transformer-based large language models, and on improving its effectiveness through techniques such as better example selection, chain-of-thought prompting, and mitigation of spurious correlations and copy bias. This research is significant because ICL offers a more efficient and adaptable approach to many machine learning problems, with impact across natural language processing, computer vision, scientific computing, and beyond.
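As a concrete illustration of the paradigm, the minimal Python sketch below assembles a few-shot prompt: a task instruction and a handful of labeled demonstrations are placed directly in the model's input, and the model is asked to complete the final query with no gradient updates. The sentiment task, the demonstrations, and the helper name build_icl_prompt are hypothetical and chosen only for illustration; the resulting string can be passed to any instruction-following LLM.

    # Minimal sketch of in-context learning via few-shot prompting (illustrative only).
    # The task, demonstrations, and query are hypothetical; no model parameters are updated.

    def build_icl_prompt(examples, query, instruction="Classify the sentiment as positive or negative."):
        """Assemble a few-shot prompt: task instruction, worked examples, then the new query."""
        lines = [instruction, ""]
        for text, label in examples:
            lines.append(f"Review: {text}")
            lines.append(f"Sentiment: {label}")
            lines.append("")
        lines.append(f"Review: {query}")
        lines.append("Sentiment:")  # the model is expected to complete this line
        return "\n".join(lines)

    demos = [
        ("The plot was gripping from start to finish.", "positive"),
        ("I walked out halfway through.", "negative"),
    ]

    prompt = build_icl_prompt(demos, "A forgettable script saved by great acting.")
    print(prompt)  # send this string to an LLM of your choice; the in-context examples steer its answer

The point of the sketch is that all task adaptation happens through the prompt itself: swapping in different demonstrations changes the task the model performs, with no fine-tuning involved.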
720 papers
Papers
April 6, 2025
M2IV: Towards Efficient and Fine-grained Multimodal In-Context Learning in Large Vision-Language Models
Yanshu Li, Hongyang He, Yi Cao, Qisen Cheng, Xiang Fu, Ruixiang Tang
Brown University ● University of Warwick ● Independent Researcher ● Samsung ● Boston University ● Rutgers University

Gating is Weighting: Understanding Gated Linear Attention through In-context Learning
Yingcong Li, Davoud Ataee Tarzanagh, Ankit Singh Rawat, Maryam Fazel, Samet Oymak
University of Michigan ● University of Pennsylvania ● Google Research NYC ● University of Washington
April 1, 2025
Context-Aware Human Behavior Prediction Using Multimodal Large Language Models: Challenges and Insights
Yuchen Liu, Lino Lerch, Luigi Palmieri, Andrey Rudenko, Sebastian Koch, Timo Ropinski, Marco Aiello
Robert Bosch GmbH ● Ulm University ● University of Stuttgart

In-Context Learning for Zero-Shot Speed Estimation of BLDC motors
Alessandro Colombo, Riccardo Busetto, Valentina Breschi, Marco Forgione, Dario Piga, Simone Formentin
Politecnico di Milano ● SUPSI ● Eindhoven University of Technology

On the Consistency of Multilingual Context Utilization in Retrieval-Augmented Generation
Jirui Qi, Raquel Fernández, Arianna Bisazza
University of Groningen ● University of Amsterdam
March 31, 2025
Contextualize-then-Aggregate: Circuits for In-Context Learning in Gemma-2 2B
Aleksandra Bakalova, Yana Veitsman, Xinting Huang, Michael Hahn
Saarland Informatics Campus ● Saarland University

Implicit In-Context Learning: Evidence from Artificial Language Experiments
Xiaomeng Ma, Qihui Xu
AWS ● Ohio State University

An extension of linear self-attention for in-context learning
Katsuyuki Hagiwara
Mie University
March 28, 2025
Teaching LLMs Music Theory with In-Context Learning and Chain-of-Thought Prompting: Pedagogical Strategies for Machines
Liam Pond, Ichiro Fujinaga
McGill University

Generative Reliability-Based Design Optimization Using In-Context Learning Capabilities of Large Language Models
Zhonglin Jiang, Qian Tang, Zequn Wang
University of Electronic Science and Technology of China

Post-Incorporating Code Structural Knowledge into LLMs via In-Context Learning for Code Translation
Yali Du, Hui Sun, Ming Li
Nanjing University
March 20, 2025
Transformer-based Wireless Symbol Detection Over Fading Channels
Li Fan, Jing Yang, Cong Shen
University of Virginia

Corrective In-Context Learning: Evaluating Self-Correction in Large Language Models
Mario Sanz-Guerrero, Katharina von der Wense
Johannes Gutenberg University Mainz ● University of Colorado Boulder