Paraphrase Generation
Paraphrase generation, the task of rewriting text while preserving its meaning, is a core area of natural language processing research focused on improving both the quality and the diversity of generated text. Current research emphasizes leveraging large language models (LLMs) and diffusion models, often incorporating techniques such as knowledge distillation, in-context learning, and syntactic control, to enhance generation quality and to address challenges such as hallucination and loss of semantic consistency. The field is crucial for applications ranging from improving the accessibility of complex texts to mitigating the spread of harmful or misleading AI-generated content, and its advances drive progress across a range of downstream NLP tasks.
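As a concrete illustration of the LLM-based approach described above, the sketch below generates paraphrase candidates with a few-shot (in-context) prompt and then filters them with an embedding-based semantic-consistency check. This is a minimal sketch, not the method of any listed paper: the model choices (google/flan-t5-base, all-MiniLM-L6-v2), the demonstration examples, and the 0.8 similarity threshold are illustrative assumptions.

```python
# Minimal sketch: few-shot paraphrase generation with an off-the-shelf
# instruction-tuned model, plus an embedding-based semantic-consistency filter.
# Model names, prompt demonstrations, and the threshold are illustrative
# assumptions, not taken from the papers listed on this page.
from transformers import pipeline
from sentence_transformers import SentenceTransformer, util

generator = pipeline("text2text-generation", model="google/flan-t5-base")
scorer = SentenceTransformer("all-MiniLM-L6-v2")

def paraphrase(sentence: str, n: int = 4, min_similarity: float = 0.8) -> list[str]:
    # A simple few-shot (in-context) prompt: two demonstrations, then the query.
    prompt = (
        "Paraphrase the sentence.\n"
        "Sentence: The meeting was postponed because of the storm.\n"
        "Paraphrase: Due to the storm, the meeting was delayed.\n"
        "Sentence: She quickly finished the report.\n"
        "Paraphrase: The report was completed by her in no time.\n"
        f"Sentence: {sentence}\n"
        "Paraphrase:"
    )
    outputs = generator(
        prompt,
        max_new_tokens=60,
        do_sample=True,
        temperature=0.9,
        num_return_sequences=n,
    )
    candidates = [o["generated_text"].strip() for o in outputs]

    # Semantic-consistency filter: keep candidates whose sentence embedding
    # stays close to the source, discarding drifted or hallucinated rewrites.
    source_emb = scorer.encode(sentence, convert_to_tensor=True)
    cand_embs = scorer.encode(candidates, convert_to_tensor=True)
    sims = util.cos_sim(source_emb, cand_embs)[0]
    return [c for c, s in zip(candidates, sims) if s >= min_similarity and c != sentence]

print(paraphrase("The new policy makes it harder for small firms to compete."))
```

Raising min_similarity trades diversity for semantic fidelity; the syntactic-control methods mentioned above would additionally condition generation on a target parse or exemplar rather than relying on sampling alone.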
Papers
Nineteen papers, dated from October 24, 2023 through November 1, 2024.