Context Size
Context size (also called the context window) in large language models (LLMs) refers to the amount of input text, measured in tokens, that a model can attend to at once, which shapes its ability to understand and generate coherent text over long inputs. Current research focuses on improving how well LLMs actually use larger contexts, investigating how models balance contextual information against their internal parametric knowledge, and developing techniques to mitigate issues such as hallucination and information leakage. This work is crucial for advancing LLM capabilities in applications such as summarization, question answering, and code generation, where a nuanced understanding of complex, lengthy inputs is required.
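As a rough illustration of what a fixed context size means in practice, the sketch below shows how an input can be counted in tokens and truncated to fit a model's context window. It assumes the Hugging Face transformers tokenizer API; the model name, reserved output budget, and helper name are illustrative placeholders, not part of any specific paper's method.

from transformers import AutoTokenizer

def fit_to_context(text: str, model_name: str = "gpt2", reserve_for_output: int = 256) -> str:
    """Truncate `text` so that it, plus a reserved output budget, fits the model's context size."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # model_max_length reports the context size the tokenizer was configured with (1024 for GPT-2).
    context_size = tokenizer.model_max_length
    budget = context_size - reserve_for_output
    token_ids = tokenizer.encode(text)
    if len(token_ids) <= budget:
        return text  # already fits within the window
    # Keep only the first `budget` tokens; real systems often use smarter selection (e.g. retrieval).
    return tokenizer.decode(token_ids[:budget])

if __name__ == "__main__":
    long_document = "Context windows limit how much text a model sees. " * 500
    prompt = fit_to_context(long_document)
    print(f"Prompt length in characters after truncation: {len(prompt)}")

Naive head-truncation like this is only a baseline; much of the research summarized above is about avoiding such lossy cuts, either by enlarging the window itself or by selecting which parts of a long input to keep.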