Generative Question Answering
Generative question answering (GQA) focuses on developing AI systems that answer questions by generating answers, rather than simply extracting them from existing text. Current research emphasizes mitigating issues such as hallucination (generating factually incorrect answers) and improving the faithfulness of answers to source material, often employing techniques like retrieval-augmented generation (RAG) and model architectures such as transformers and diffusion models. This field is significant because it pushes the boundaries of AI's ability to understand and reason with information, with potential applications ranging from improved search engines and educational tools to more sophisticated medical diagnosis and decision support systems.
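To make the RAG idea mentioned above concrete, the following is a minimal sketch, assuming a tiny in-memory corpus, bag-of-words cosine similarity for retrieval, and a stub in place of the generative model; the corpus text, function names, and prompt format are illustrative assumptions, not taken from any of the papers listed below.

```python
# Minimal RAG sketch: retrieve the most relevant passages for a question,
# then build a context-augmented prompt for a generative QA model.
from collections import Counter
import math

# Hypothetical toy corpus; a real system would index actual documents.
CORPUS = [
    "Retrieval-augmented generation grounds answers in retrieved documents.",
    "Hallucination refers to a model generating factually incorrect answers.",
    "Faithfulness measures how well an answer is supported by its sources.",
]

def vectorize(text: str) -> Counter:
    """Lowercased bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k corpus passages most similar to the question."""
    q = vectorize(question)
    ranked = sorted(CORPUS, key=lambda doc: cosine(q, vectorize(doc)), reverse=True)
    return ranked[:k]

def answer(question: str) -> str:
    """Assemble a context-augmented prompt; generation itself is left as a stub."""
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    # A real system would pass `prompt` to a generative model here; this stub
    # returns the prompt so the example stays self-contained and runnable.
    return prompt

if __name__ == "__main__":
    print(answer("What is hallucination in generative question answering?"))
```

In practice the bag-of-words retriever would be replaced by a dense embedding index and the stub by a call to an actual language model, but the overall flow (retrieve, augment the prompt, generate) stays the same.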
Papers
AITTI: Learning Adaptive Inclusive Token for Text-to-Image Generation
Xinyu Hou, Xiaoming Li, Chen Change Loy
Free to play: UN Trade and Development's experience with developing its own open-source Retrieval Augmented Generation Large Language Model application
Daniel Hopp
Retrieval-Augmented Generation for Generative Artificial Intelligence in Medicine
Rui Yang, Yilin Ning, Emilia Keppo, Mingxuan Liu, Chuan Hong, Danielle S Bitterman, Jasmine Chiat Ling Ong, Daniel Shu Wei Ting, Nan Liu