Text Generation
Text generation research focuses on building models that produce high-quality, coherent, and controllable text. Current efforts concentrate on improving evaluation methods (e.g., using LLMs as judges and incorporating adaptive references), enhancing controllability through techniques such as divide-and-conquer strategies and prompt engineering, and mitigating hallucination and memorization through decoding strategies and knowledge integration. These advances have significant implications for diverse applications, including clinical documentation, scientific writing, and creative content generation, while also raising important ethical considerations around bias, safety, and responsible use.
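To make the decoding-time controllability theme concrete, here is a minimal sketch using the Hugging Face transformers generate API: a custom logits processor that softly biases sampling toward chosen keyword tokens. The model name ("gpt2"), the keyword, and the bonus value are illustrative placeholders, and this generic example is deliberately simpler than the methods in the papers below (e.g., gradient-based constrained sampling or RL-based unlearning), not a reproduction of any of them.

```python
# A minimal sketch of soft decoding-time control, assuming the Hugging Face
# `transformers` API. Model, keyword, and bonus value are illustrative.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    LogitsProcessor,
    LogitsProcessorList,
)

class KeywordBoostProcessor(LogitsProcessor):
    """Adds a constant bonus to the logits of chosen keyword tokens at every
    decoding step, nudging generation toward them without hard constraints."""

    def __init__(self, keyword_token_ids, bonus=4.0):
        self.keyword_token_ids = keyword_token_ids
        self.bonus = bonus

    def __call__(self, input_ids, scores):
        # scores: (batch, vocab) logits for the next token.
        scores[:, self.keyword_token_ids] += self.bonus
        return scores

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Leading space matters for GPT-2's BPE tokenization of a standalone word.
keyword_ids = tokenizer(" ocean", add_special_tokens=False).input_ids
prompt = tokenizer("The story begins:", return_tensors="pt")

output = model.generate(
    **prompt,
    max_new_tokens=40,
    do_sample=True,
    logits_processor=LogitsProcessorList([KeywordBoostProcessor(keyword_ids)]),
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Soft logit biasing trades guarantees for fluency: unlike hard lexical constraints, it cannot force the keyword to appear, but it rarely derails the model's distribution.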
Papers
Quark: Controllable Text Generation with Reinforced Unlearning
Ximing Lu, Sean Welleck, Jack Hessel, Liwei Jiang, Lianhui Qin, Peter West, Prithviraj Ammanabrolu, Yejin Choi
Keywords and Instances: A Hierarchical Contrastive Learning Framework Unifying Hybrid Granularities for Text Generation
Mingzhe Li, XieXiong Lin, Xiuying Chen, Jinxiong Chang, Qishen Zhang, Feng Wang, Taifeng Wang, Zhongyi Liu, Wei Chu, Dongyan Zhao, Rui Yan
RSTGen: Imbuing Fine-Grained Interpretable Control into Long-Form Text Generators
Rilwan A. Adewoyin, Ritabrata Dutta, Yulan He
Gradient-Based Constrained Sampling from Language Models
Sachin Kumar, Biswajit Paria, Yulia Tsvetkov
R2D2: Robust Data-to-Text with Replacement Detection
Linyong Nan, Lorenzo Jaime Yu Flores, Yilun Zhao, Yixin Liu, Luke Benson, Weijin Zou, Dragomir Radev
Medical Scientific Table-to-Text Generation with Human-in-the-Loop under the Data Sparsity Constraint
Heng-Yi Wu, Jingqing Zhang, Julia Ive, Tong Li, Vibhor Gupta, Bingyuan Chen, Yike Guo
On Advances in Text Generation from Images Beyond Captioning: A Case Study in Self-Rationalization
Shruti Palaskar, Akshita Bhagia, Yonatan Bisk, Florian Metze, Alan W Black, Ana Marasović
A Self-Paced Mixed Distillation Method for Non-Autoregressive Generation
Weizhen Qi, Yeyun Gong, Yelong Shen, Jian Jiao, Yu Yan, Houqiang Li, Ruofei Zhang, Weizhu Chen, Nan Duan
TempLM: Distilling Language Models into Template-Based Generators
Tianyi Zhang, Mina Lee, Lisa Li, Ende Shen, Tatsunori B. Hashimoto