Open Domain Question Answering
Open-domain question answering (ODQA) aims to build systems that answer factual questions from vast, unstructured knowledge sources. Current research focuses on improving retrieval methods, particularly dense retrieval, and on integrating large language models (LLMs) into retrieval-augmented generation (RAG) frameworks, including techniques such as in-context learning and adaptive retrieval strategies for handling noisy or incomplete information. These advances are crucial for improving the accuracy and efficiency of ODQA systems, with significant implications for applications ranging from conversational AI to information access and knowledge discovery.
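The retrieve-then-read pattern underlying RAG can be sketched in a few lines. The example below is a minimal, illustrative sketch, not any specific system from the papers listed here: a toy bag-of-words retriever stands in for a dense retriever, and the final LLM call is represented by the assembled prompt string. All function names (`retrieve`, `build_rag_prompt`, etc.) are hypothetical.

```python
import math
import re
from collections import Counter

def bow_vector(text):
    # Toy bag-of-words term counts; a real ODQA system would use a
    # dense encoder (e.g. a trained bi-encoder) instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, corpus, k=2):
    # Rank passages by similarity to the question; keep the top k.
    q = bow_vector(question)
    return sorted(corpus, key=lambda p: cosine(q, bow_vector(p)), reverse=True)[:k]

def build_rag_prompt(question, passages):
    # Concatenate retrieved evidence with the question; a real system
    # would feed this prompt to an LLM to generate the answer.
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return f"Answer using the passages below.\n{context}\nQuestion: {question}\nAnswer:"

corpus = [
    "Paris is the capital of France.",
    "The Eiffel Tower was completed in 1889.",
    "Mount Everest is the tallest mountain on Earth.",
]
question = "What is the capital of France?"
passages = retrieve(question, corpus)
prompt = build_rag_prompt(question, passages)
```

Adaptive retrieval strategies, in this framing, decide per question whether and how much to retrieve before building the prompt; relevance-aware frameworks additionally score how well each retrieved passage supports the answer.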
Papers
REAR: A Relevance-Aware Retrieval-Augmented Framework for Open-Domain Question Answering
Yuhao Wang, Ruiyang Ren, Junyi Li, Wayne Xin Zhao, Jing Liu, Ji-Rong Wen
Reasoning in Conversation: Solving Subjective Tasks through Dialogue Simulation for Large Language Models
Xiaolong Wang, Yile Wang, Yuanchi Zhang, Fuwen Luo, Peng Li, Maosong Sun, Yang Liu