Abstractive Text Summarization
Abstractive text summarization aims to generate concise, coherent summaries that capture the essence of a longer text, rather than simply extracting sentences from it. Current research focuses on improving the accuracy and fluency of these summaries using large language models (LLMs) and sequence-to-sequence models, often incorporating attention mechanisms, hierarchical structures, and data augmentation to address challenges such as hallucination and factual inconsistency. Effective summarization is crucial for efficient information retrieval and comprehension across diverse domains, from news articles to scientific literature and legal documents.
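As a minimal sketch of how such sequence-to-sequence summarizers are typically invoked in practice, the snippet below uses the Hugging Face transformers summarization pipeline with a BART checkpoint; the specific model name and generation parameters are illustrative assumptions, not drawn from the papers surveyed here, and any pretrained encoder-decoder summarization model could be substituted.

```python
# Minimal sketch: abstractive summarization with a pretrained
# sequence-to-sequence model. Assumes the Hugging Face `transformers`
# package and the `facebook/bart-large-cnn` checkpoint (an assumption
# for illustration; swap in any summarization-capable model).
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "Abstractive summarization systems generate new sentences rather than "
    "copying them from the source document. Modern approaches fine-tune "
    "large sequence-to-sequence models with attention so the decoder can "
    "condense a long input into a short, fluent summary."
)

# max_length / min_length bound the generated summary length in tokens;
# do_sample=False keeps decoding deterministic (beam search by default).
result = summarizer(article, max_length=60, min_length=15, do_sample=False)
print(result[0]["summary_text"])
```

In a real evaluation setting, the generated summary would then be scored against reference summaries (e.g., with ROUGE) and checked for the factual-consistency issues noted above.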