Quality Text

High-quality text generation is a rapidly evolving field focused on producing coherent, relevant, and engaging text with computational methods. Current research emphasizes improving the diversity and fluency of generated text, often employing large language models (LLMs) together with decoding techniques such as minimum Bayes risk decoding and contrastive decoding to optimize for both model likelihood and human preference. This work is crucial both for mitigating risks such as machine-generated disinformation and for advancing applications like summarization, translation, and creative writing, with impact on fields ranging from journalism to education.
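
As a concrete illustration of one of the decoding techniques mentioned above, the sketch below shows a minimal sampling-based minimum Bayes risk (MBR) decoder: among a pool of sampled candidates, it selects the one with the highest average utility measured against the other candidates, a Monte Carlo estimate of expected utility under the model. The utility function shown here (a toy token-level F1 overlap) and the example strings are illustrative assumptions, not taken from any specific paper; practical systems typically plug in a metric such as chrF, COMET, or BERTScore.

```python
def mbr_decode(candidates, utility):
    """Return the candidate with the highest average utility against
    the other sampled candidates (a Monte Carlo approximation of the
    Bayes-optimal, i.e. minimum-risk, output)."""
    best, best_score = None, float("-inf")
    for hyp in candidates:
        # Each other candidate acts as a pseudo-reference drawn from
        # the model distribution; averaging utility over them
        # estimates the expected utility of `hyp`.
        others = [ref for ref in candidates if ref is not hyp]
        score = sum(utility(hyp, ref) for ref in others) / max(len(others), 1)
        if score > best_score:
            best, best_score = hyp, score
    return best


if __name__ == "__main__":
    # Toy utility: token-level F1 overlap between two strings
    # (stands in for a real metric such as chrF or BERTScore).
    def token_f1(a, b):
        ta, tb = set(a.split()), set(b.split())
        if not ta or not tb:
            return 0.0
        overlap = len(ta & tb)
        p, r = overlap / len(ta), overlap / len(tb)
        return 2 * p * r / (p + r) if (p + r) else 0.0

    samples = [
        "the cat sat on the mat",
        "a cat sat on the mat",
        "the dog ran in the park",
    ]
    print(mbr_decode(samples, token_f1))  # picks a consensus-like candidate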

Papers