Text Generation
Text generation research focuses on creating models that produce high-quality, coherent, and controllable text. Current efforts concentrate on improving evaluation methods (e.g., using LLMs as judges and incorporating adaptive references), enhancing controllability through techniques like divide-and-conquer strategies and prompt engineering, and addressing challenges such as hallucinations and memorization through various decoding strategies and knowledge integration. These advancements have significant implications for diverse applications, including clinical documentation, scientific writing, and creative content generation, while also raising important ethical considerations regarding bias, safety, and responsible use.
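For readers unfamiliar with the LLM-as-judge evaluation mentioned above, the following is a minimal, illustrative Python sketch of the idea: a judge model is prompted to score a generated answer against a small rubric, and the numeric rating is parsed from its reply. The JUDGE_PROMPT wording, the 1-5 scale, and the call_judge_model helper are hypothetical placeholders, not taken from any of the papers listed here.

```python
# Minimal sketch of LLM-as-judge evaluation for generated text.
# Everything below is illustrative: in practice, call_judge_model would wrap
# a real LLM API client, and the rubric would be tailored to the task.

JUDGE_PROMPT = """You are grading a model-generated answer.
Question: {question}
Answer: {answer}
Rate the answer's coherence and factuality from 1 (poor) to 5 (excellent).
Reply with a single integer."""


def call_judge_model(prompt: str) -> str:
    """Placeholder for a real judge-model call; returns a canned score here."""
    return "4"


def judge_score(question: str, answer: str) -> int:
    """Ask the judge model for a 1-5 rating and parse the first integer it returns."""
    reply = call_judge_model(JUDGE_PROMPT.format(question=question, answer=answer))
    digits = [c for c in reply if c.isdigit()]
    return int(digits[0]) if digits else 0


if __name__ == "__main__":
    score = judge_score(
        question="What causes seasons on Earth?",
        answer="The tilt of Earth's axis relative to its orbital plane.",
    )
    print(f"judge score: {score}/5")
```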
503 papers
Papers - Page 12
June 1, 2024

AMGPT: a Large Language Model for Contextual Querying in Additive Manufacturing
Achuth Chandrasekhar, Jonathan Chan, Francis Ogoke, Olabode Ajenifujah, Amir Barati Farimani

May 24, 2024

Text Generation: A Systematic Literature Review of Tasks, Evaluation, and Challenges
Jonas Becker, Jan Philip Wahle, Bela Gipp, Terry Ruas

Linearly Controlled Language Generation with Performative Guarantees
Emily Cheng, Marco Baroni, Carmen Amo Alonso

Embedding-Aligned Language Models
Guy Tennenholtz, Yinlam Chow, Chih-Wei Hsu, Lior Shani, Ethan Liang, Craig Boutilier

Athena: Efficient Block-Wise Post-Training Quantization for Large Language Models Using Second-Order Matrix Derivative Information
Yanshu Wang, Wenyang He, Tong Yang

May 21, 2024

OLAPH: Improving Factuality in Biomedical Long-form Question Answering
Minbyul Jeong, Hyeon Hwang, Chanwoong Yoon, Taewhoo Lee, Jaewoo Kang

Exploration of Masked and Causal Language Modelling for Text Generation
Nicolo Micheletti, Samuel Belkadi, Lifeng Han, Goran Nenadic

CustomText: Customized Textual Image Generation using Diffusion Models
Shubham Paliwal, Arushi Jain, Monika Sharma, Vikram Jamwal, Lovekesh Vig