Generative Question Answering
Generative question answering (GQA) focuses on developing AI systems that answer questions by generating free-form text, rather than simply extracting spans from existing documents. Current research emphasizes mitigating hallucinations (factually incorrect generated answers) and improving the faithfulness of answers to source material, often through retrieval-augmented generation (RAG) and model architectures such as transformers and diffusion models. The field is significant because it pushes the boundaries of AI's ability to understand and reason over information, with potential applications ranging from improved search engines and educational tools to more sophisticated medical diagnosis and decision-support systems.
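The RAG approach mentioned above can be summarized as: retrieve passages relevant to the question, then condition the generator on them so the answer stays grounded in retrieved evidence. The sketch below illustrates that pattern in plain Python; the toy corpus, the lexical-overlap scorer, and the generate() placeholder are illustrative assumptions, not components of any system cited here.

```python
# Minimal sketch of retrieval-augmented generation (RAG) for generative QA.
# The corpus, scoring function, and generate() are illustrative placeholders.

from collections import Counter
import math

CORPUS = [
    "Quantum error-correcting codes protect logical qubits from noise.",
    "Generative pre-trained transformers model sequences autoregressively.",
    "Reporting guidelines improve transparency in research that uses AI.",
]

def score(question: str, passage: str) -> float:
    """Toy lexical-overlap relevance score (stand-in for a learned retriever)."""
    q = Counter(question.lower().split())
    p = Counter(passage.lower().split())
    overlap = sum((q & p).values())
    return overlap / math.sqrt(len(passage.split()) + 1)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k passages most relevant to the question."""
    return sorted(CORPUS, key=lambda p: score(question, p), reverse=True)[:k]

def generate(question: str, passages: list[str]) -> str:
    """Placeholder for a generative model: condition the answer on retrieved text."""
    context = " ".join(passages)
    return f"Answer to '{question}', grounded in retrieved context: {context}"

if __name__ == "__main__":
    question = "How do transformers generate answers?"
    print(generate(question, retrieve(question)))
```

Conditioning generation on retrieved passages is what lets the answer be checked against source material; in practice the lexical scorer would be replaced by a dense retriever and generate() by a trained language model.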
Papers
qecGPT: decoding Quantum Error-correcting Codes with Generative Pre-trained Transformers
Hanyan Cao, Feng Pan, Yijia Wang, Pan Zhang
Development of the ChatGPT, Generative Artificial Intelligence and Natural Large Language Models for Accountable Reporting and Use (CANGARU) Guidelines
Giovanni E. Cacciamani, Michael B. Eppler, Conner Ganjavi, Asli Pekan, Brett Biedermann, Gary S. Collins, Inderbir S. Gill