Text Generation
Text generation research focuses on building models that produce high-quality, coherent, and controllable text. Current efforts concentrate on three fronts: improving evaluation methods (e.g., using LLMs as judges and incorporating adaptive references), enhancing controllability through techniques such as divide-and-conquer strategies and prompt engineering, and mitigating hallucination and memorization via decoding strategies and knowledge integration. These advances have significant implications for applications ranging from clinical documentation to scientific writing and creative content generation, while also raising ethical considerations around bias, safety, and responsible use. An illustrative sketch of the LLM-as-judge evaluation pattern follows below.
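To make the LLM-as-judge idea concrete, here is a minimal sketch of scalar-score evaluation: a judge model is prompted with a rubric and asked to rate a candidate response. The `JUDGE_TEMPLATE`, the `judge_score` helper, and the 1-5 scale are illustrative assumptions, not the protocol of any specific paper listed here; `call_llm` is a placeholder for whatever chat-completion client you use.

```python
import re
from typing import Callable, Optional

# Illustrative rubric; real LLM-as-judge setups tune the criteria,
# scale, and output format to the task being evaluated.
JUDGE_TEMPLATE = """You are an impartial judge. Rate the candidate response \
below for coherence and fluency on a 1-5 scale. Reply with only the number.

Source prompt:
{source}

Candidate response:
{candidate}
"""

def judge_score(
    source: str,
    candidate: str,
    call_llm: Callable[[str], str],
) -> Optional[int]:
    """Ask a judge model for a 1-5 quality score and parse its reply.

    `call_llm` is any function that sends a prompt to a chat model and
    returns its text reply (e.g., a thin wrapper around your provider's
    API); it is a hypothetical hook, not a real library call.
    """
    reply = call_llm(JUDGE_TEMPLATE.format(source=source, candidate=candidate))
    match = re.search(r"[1-5]", reply)  # first digit in range wins
    return int(match.group()) if match else None  # None: unparseable reply

if __name__ == "__main__":
    # Dry run with a stub judge that always answers "4".
    score = judge_score("Summarize the article.", "A short summary.",
                        call_llm=lambda prompt: "4")
    print(score)  # -> 4
```

A common variant replaces scalar scoring with pairwise comparison (asking the judge which of two candidates is better), which tends to be more reliable than absolute ratings but requires aggregating many comparisons into a ranking.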
Papers
NGEP: A Graph-based Event Planning Framework for Story Generation
Chen Tang, Zhihao Zhang, Tyler Loakman, Chenghua Lin, Frank Guerin
BioGPT: Generative Pre-trained Transformer for Biomedical Text Generation and Mining
Renqian Luo, Liai Sun, Yingce Xia, Tao Qin, Sheng Zhang, Hoifung Poon, Tie-Yan Liu
Language Detoxification with Attribute-Discriminative Latent Space
Jin Myung Kwak, Minseon Kim, Sung Ju Hwang