Text Simplification
Text simplification rewrites complex texts into easier-to-understand versions while preserving meaning, primarily benefiting readers with cognitive impairments or limited literacy. Current research relies heavily on large language models (LLMs) such as T5, BART, and GPT variants, often applying fine-tuning, prompt engineering, and constrained decoding to improve simplification accuracy and control readability, as in the sketch below. The field is crucial for making information accessible across domains, from medical reports to educational materials, and ongoing work focuses on better evaluation metrics and on challenges such as information loss and the need for diverse, high-quality training data.
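As a minimal sketch of the prompt-based approach mentioned above, the snippet below runs a seq2seq checkpoint through the Hugging Face transformers library with a task-style prompt. The model name ("t5-base") and the "simplify:" prefix are illustrative assumptions; a checkpoint actually fine-tuned for simplification would typically expect its own prompt format or none at all.

```python
# Sketch: prompt-based text simplification with a seq2seq model.
# Assumes the Hugging Face `transformers` library is installed;
# model name and prompt wording are illustrative, not a fixed standard.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-base"  # assumption: any T5/BART-style checkpoint could be swapped in
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

complex_sentence = (
    "The physician recommended the administration of analgesics "
    "to mitigate the patient's discomfort."
)

# T5-style task prefix acting as a simple prompt.
inputs = tokenizer("simplify: " + complex_sentence, return_tensors="pt")

# Beam search decoding; constrained decoding (e.g., forcing or banning words
# to control readability) would plug in here via additional generate() arguments.
outputs = model.generate(**inputs, max_new_tokens=60, num_beams=4)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Fine-tuning follows the same interface: the model above is trained on pairs of complex and simplified sentences before generation, rather than relying on the prompt alone.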
Papers