Text Generation
Text generation research focuses on building models that produce high-quality, coherent, and controllable text. Current efforts concentrate on three fronts: improving evaluation methods (e.g., using LLMs as judges and incorporating adaptive references), enhancing controllability through techniques such as divide-and-conquer strategies and prompt engineering, and mitigating hallucination and memorization via decoding strategies and knowledge integration. These advances matter for applications ranging from clinical documentation and scientific writing to creative content generation, while also raising ethical questions about bias, safety, and responsible use.
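To make the LLM-as-judge evaluation idea concrete, below is a minimal sketch; it is illustrative only and not drawn from any paper listed here. The `complete` callable, the prompt template, and the 1-5 rubric are all assumptions: plug in any function that sends a prompt to an LLM and returns its text reply.

```python
# Minimal LLM-as-judge sketch (illustrative assumptions throughout).
import re
from typing import Callable, Optional

# Hypothetical judging prompt; real rubrics are usually more detailed.
JUDGE_TEMPLATE = """You are an impartial judge. Rate the RESPONSE to the
INSTRUCTION on a 1-5 scale for coherence and factual accuracy.
Reply with only the integer score.

INSTRUCTION: {instruction}
RESPONSE: {response}
SCORE:"""

def judge_score(instruction: str, response: str,
                complete: Callable[[str], str]) -> Optional[int]:
    """Ask an LLM to grade a generated response; return the parsed 1-5 score."""
    prompt = JUDGE_TEMPLATE.format(instruction=instruction, response=response)
    reply = complete(prompt)
    match = re.search(r"[1-5]", reply)            # tolerate extra judge chatter
    return int(match.group()) if match else None  # None = unparseable verdict
```

In practice, such scores are averaged over a test set to compare generation systems, and verdicts from a single judge model should themselves be audited for bias (a concern the "BERTScore is Unfair" paper below raises for model-based metrics generally).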
Papers
LEATHER: A Framework for Learning to Generate Human-like Text in Dialogue
Anthony Sicilia, Malihe Alikhani
BERTScore is Unfair: On Social Bias in Language Model-Based Metrics for Text Generation
Tianxiang Sun, Junliang He, Xipeng Qiu, Xuanjing Huang
PCFG-based Natural Language Interface Improves Generalization for Controlled Text Generation
Jingyu Zhang, James Glass, Tianxing He
Towards a Unified Multi-Dimensional Evaluator for Text Generation
Ming Zhong, Yang Liu, Da Yin, Yuning Mao, Yizhu Jiao, Pengfei Liu, Chenguang Zhu, Heng Ji, Jiawei Han
Scaling Back-Translation with Domain Text Generation for Sign Language Gloss Translation
Jinhui Ye, Wenxiang Jiao, Xing Wang, Zhaopeng Tu
The COVID That Wasn't: Counterfactual Journalism Using GPT
Sil Hamilton, Andrew Piper
Visualize Before You Write: Imagination-Guided Open-Ended Text Generation
Wanrong Zhu, An Yan, Yujie Lu, Wenda Xu, Xin Eric Wang, Miguel Eckstein, William Yang Wang
How Large Language Models are Transforming Machine-Paraphrased Plagiarism
Jan Philip Wahle, Terry Ruas, Frederic Kirstein, Bela Gipp
PCAE: A Framework of Plug-in Conditional Auto-Encoder for Controllable Text Generation
Haoqin Tu, Zhongliang Yang, Jinshuai Yang, Siyu Zhang, Yongfeng Huang
A Unified Encoder-Decoder Framework with Entity Memory
Zhihan Zhang, Wenhao Yu, Chenguang Zhu, Meng Jiang
FAST: Improving Controllability for Text Generation with Feedback Aware Self-Training
Junyi Chai, Reid Pryzant, Victor Ye Dong, Konstantin Golobokov, Chenguang Zhu, Yi Liu
A Distributional Lens for Multi-Aspect Controllable Text Generation
Yuxuan Gu, Xiaocheng Feng, Sicheng Ma, Lingyuan Zhang, Heng Gong, Bing Qin