Neural Text Generation
Neural text generation aims to produce human-quality text with artificial neural networks, with a focus on improving the fluency, diversity, and controllability of the generated output. Current research emphasizes more efficient decoding algorithms (such as contrastive search and best-k search) and the incorporation of constraints (lexical, structural, or relation-based) to guide generation, often built on top of large language models. These advances matter for applications such as machine translation, summarization, and dialogue systems, where more accurate, diverse, and controllable generation translates directly into better output.
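To make the decoding side of this concrete, the sketch below shows the per-step scoring rule used by contrastive-search-style decoding: each top-k candidate is scored by its model confidence minus a degeneration penalty (its maximum cosine similarity to the hidden states of the tokens generated so far). This is a minimal illustration, not a reference implementation; the tensor names (candidate_probs, candidate_states, context_states) and the toy inputs are assumptions standing in for real model outputs.

```python
# Minimal sketch of a contrastive-search decoding step (SimCTG-style scoring).
# Assumes the caller already has, for the current step: the model's top-k
# candidate probabilities and a hidden-state vector for each candidate.
import torch
import torch.nn.functional as F

def contrastive_search_step(candidate_probs: torch.Tensor,   # (k,)   confidence of each top-k token
                            candidate_states: torch.Tensor,  # (k, d) hidden state if that token is appended
                            context_states: torch.Tensor,    # (t, d) hidden states of tokens generated so far
                            alpha: float = 0.6) -> int:
    """Pick the candidate balancing model confidence against a degeneration penalty."""
    # Cosine similarity of each candidate's state to every previous token state: (k, t).
    sim = F.cosine_similarity(
        candidate_states.unsqueeze(1),   # (k, 1, d)
        context_states.unsqueeze(0),     # (1, t, d)
        dim=-1,
    )
    degeneration_penalty = sim.max(dim=-1).values          # (k,) worst-case similarity per candidate
    scores = (1 - alpha) * candidate_probs - alpha * degeneration_penalty
    return int(scores.argmax().item())

# Toy usage with random tensors as stand-ins for real model outputs.
k, t, d = 4, 8, 16
chosen = contrastive_search_step(torch.rand(k), torch.randn(k, d), torch.randn(t, d))
print("selected candidate index:", chosen)
```

In practice the candidate hidden states would come from a forward pass of the language model on each candidate continuation, and the whole loop would be wrapped in the model's generation routine rather than called in isolation.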