Dialogue Generation
Dialogue generation focuses on creating natural and engaging conversational agents, aiming to improve the fluency, coherence, and personalization of AI-driven conversations. Current research emphasizes mitigating limitations such as hallucination and bias, improving efficiency through techniques such as knowledge distillation and retrieval-augmented generation, and enhancing personalization across a range of model architectures, including LLMs, diffusion models, and encoder-decoder models. These advances matter for applications such as chatbots, virtual assistants, and therapeutic AI: they improve human-computer interaction and could benefit fields like mental health support and education.
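To make the retrieval-augmented generation idea mentioned above concrete, here is a minimal sketch of how a dialogue system might ground a response prompt in retrieved knowledge. The knowledge base, overlap-based scoring, and prompt template are hypothetical stand-ins for a real retriever and generator, not any specific system from the papers below.

```python
def score(query: str, passage: str) -> int:
    """Count overlapping lowercase tokens between query and passage (toy relevance score)."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, knowledge_base: list[str]) -> str:
    """Return the passage with the highest token overlap with the query."""
    return max(knowledge_base, key=lambda p: score(query, p))

def build_prompt(user_turn: str, knowledge_base: list[str]) -> str:
    """Condition the generation prompt on the retrieved evidence passage."""
    evidence = retrieve(user_turn, knowledge_base)
    return f"Knowledge: {evidence}\nUser: {user_turn}\nAssistant:"

# Hypothetical two-passage knowledge base for illustration.
kb = [
    "The Eiffel Tower is in Paris and opened in 1889.",
    "Mount Everest is the highest mountain above sea level.",
]
print(build_prompt("When did the Eiffel Tower open?", kb))
```

In a production system the token-overlap scorer would be replaced by a dense or sparse retriever and the prompt would be fed to a trained generator; the point here is only the control flow of retrieve-then-generate, which grounds responses in evidence and thereby reduces hallucination.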
Papers
Learning to Express in Knowledge-Grounded Conversation
Xueliang Zhao, Tingchen Fu, Chongyang Tao, Wei Wu, Dongyan Zhao, Rui Yan
Stylized Knowledge-Grounded Dialogue Generation via Disentangled Template Rewriting
Qingfeng Sun, Can Xu, Huang Hu, Yujing Wang, Jian Miao, Xiubo Geng, Yining Chen, Fei Xu, Daxin Jiang