Dialogue Generation
Dialogue generation focuses on creating natural and engaging conversational agents, aiming to improve the fluency, coherence, and personalization of AI-driven conversations. Current research emphasizes mitigating limitations such as hallucinations and biases, improving grounding and efficiency through techniques such as retrieval-augmented generation and knowledge distillation, and enhancing personalization across a range of model architectures, including LLMs, diffusion models, and encoder-decoder models. These advances have significant implications for applications such as chatbots, virtual assistants, and therapeutic AI, improving human-computer interaction and potentially impacting fields like mental health support and education.
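To make the retrieval-augmented generation idea mentioned above concrete, here is a minimal illustrative sketch. Everything in it is hypothetical: the toy knowledge base, the word-overlap `retrieve` function (a stand-in for a real dense retriever), and `build_prompt` are invented for illustration; production systems pair a learned retriever with an LLM generator.

```python
# Illustrative sketch of retrieval-augmented dialogue generation.
# The knowledge base, retriever, and prompt format are all hypothetical
# stand-ins; real systems use dense embeddings and an LLM to generate.

KNOWLEDGE_BASE = [
    "Knowledge distillation trains a small student model to mimic a larger teacher.",
    "Retrieval-augmented generation grounds responses in retrieved documents.",
    "Hallucination refers to fluent but factually unsupported model output.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(user_turn: str) -> str:
    """Prepend retrieved evidence so the generator can ground its reply."""
    evidence = retrieve(user_turn, KNOWLEDGE_BASE)
    context = "\n".join(f"[doc] {d}" for d in evidence)
    return f"{context}\n[user] {user_turn}\n[assistant]"

print(build_prompt("What is retrieval-augmented generation?"))
```

The design point is that grounding happens before generation: the retrieved evidence is placed in the prompt, so the generator's output can be checked against it, which is one way such systems reduce hallucination.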
Papers
Does Collaborative Human-LM Dialogue Generation Help Information Extraction from Human Dialogues?
Bo-Ru Lu, Nikita Haduong, Chia-Hsuan Lee, Zeqiu Wu, Hao Cheng, Paul Koester, Jean Utke, Tao Yu, Noah A. Smith, Mari Ostendorf
DecompEval: Evaluating Generated Texts as Unsupervised Decomposed Question Answering
Pei Ke, Fei Huang, Fei Mi, Yasheng Wang, Qun Liu, Xiaoyan Zhu, Minlie Huang
RefGPT: Dialogue Generation of GPT, by GPT, and for GPT
Dongjie Yang, Ruifeng Yuan, Yuantao Fan, Yifei Yang, Zili Wang, Shusen Wang, Hai Zhao
Dolphin: A Challenging and Diverse Benchmark for Arabic NLG
El Moatez Billah Nagoudi, AbdelRahim Elmadany, Ahmed El-Shangiti, Muhammad Abdul-Mageed
Cross-lingual Data Augmentation for Document-grounded Dialog Systems in Low Resource Languages
Qi Gou, Zehua Xia, Wenzhe Du