Generative Question Answering
Generative question answering (GQA) focuses on developing AI systems that can answer questions by generating answers, rather than simply extracting them from existing text. Current research emphasizes mitigating issues like hallucinations (generating factually incorrect answers) and improving the faithfulness of answers to source material, often employing techniques like retrieval-augmented generation (RAG) and novel model architectures such as transformers and diffusion models. This field is significant because it pushes the boundaries of AI's ability to understand and reason with information, with potential applications ranging from improved search engines and educational tools to more sophisticated medical diagnosis and decision support systems.
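To make the retrieval-augmented generation (RAG) idea concrete, here is a minimal sketch of the retrieve-then-generate loop: score a small corpus against the question with bag-of-words cosine similarity, then assemble the top passages into a prompt for a generator. The corpus, the `retrieve` and `build_prompt` helpers, and the prompt wording are illustrative assumptions, not part of any paper above; a real system would use a learned retriever and a language model in place of the final prompt string.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words term-count vectors.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Rank passages by similarity to the query and keep the top k.
    q = Counter(query.lower().split())
    ranked = sorted(corpus, key=lambda d: cosine(q, Counter(d.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    # The generator (a language model, not included here) would answer
    # conditioned on the retrieved passages, which is what grounds the
    # answer in source material and curbs hallucination.
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only the context below.\nContext:\n{context}\nQuestion: {query}\nAnswer:"

# Toy corpus for illustration (hypothetical data).
corpus = [
    "Tides are caused by the gravitational pull of the moon.",
    "Python is a popular programming language.",
    "The sun also influences ocean tides.",
]
passages = retrieve("what causes ocean tides", corpus, k=2)
prompt = build_prompt("what causes ocean tides", passages)
```

Faithfulness here comes from the instruction to answer only from the retrieved context; real systems additionally verify that the generated answer is entailed by the passages.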
Papers
Crystal Transformer: Self-learning neural language model for Generative and Tinkering Design of Materials
Lai Wei, Qinyang Li, Yuqi Song, Stanislav Stefanov, Edirisuriya M. D. Siriwardane, Fanglin Chen, Jianjun Hu
Which Discriminator for Cooperative Text Generation?
Antoine Chaffin, Thomas Scialom, Sylvain Lamprier, Jacopo Staiano, Benjamin Piwowarski, Ewa Kijak, Vincent Claveau