Open Domain Question Answering
Open-domain question answering (ODQA) aims to build systems that answer factual questions over vast, unstructured knowledge sources. Current research focuses on improving retrieval methods, particularly dense retrieval, and on integrating large language models (LLMs) into retrieval-augmented generation (RAG) frameworks; this includes techniques such as in-context learning and adaptive retrieval strategies for handling noisy or incomplete information. These advances are crucial for improving the accuracy and efficiency of ODQA systems, with significant implications for applications ranging from conversational AI to information access and knowledge discovery.
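The retrieve-then-read pattern underlying RAG-style ODQA can be sketched minimally. Everything below is illustrative: the toy passage store, the bag-of-words similarity, and the stub "reader" are assumptions for the sketch — production systems use learned dense encoders and an actual LLM reader conditioned on the retrieved context.

```python
import math
from collections import Counter

# Toy knowledge source; real ODQA systems index millions of passages.
PASSAGES = [
    "Paris is the capital of France.",
    "The Eiffel Tower was completed in 1889.",
    "Mount Everest is the highest mountain on Earth.",
]

def embed(text):
    """Bag-of-words 'embedding' (sparse term counts).
    A dense retriever would use a learned neural encoder instead."""
    tokens = text.lower().replace("?", "").replace(".", "").split()
    return Counter(tokens)

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, k=2):
    """Retriever: rank passages by similarity to the question, return top-k."""
    q = embed(question)
    ranked = sorted(PASSAGES, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:k]

def answer(question):
    """Stub reader: a RAG system would prompt an LLM with the retrieved
    context; here we simply return the best-matching passage as evidence."""
    context = retrieve(question)
    return context[0]

print(answer("What is the capital of France?"))
```

The two-stage structure — a cheap similarity search narrowing the corpus, then a reader operating only on the retrieved context — is what adaptive retrieval strategies refine, e.g., by deciding when retrieval is needed or by reranking the candidate passages.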
Papers
Expand, Rerank, and Retrieve: Query Reranking for Open-Domain Question Answering
Yung-Sung Chuang, Wei Fang, Shang-Wen Li, Wen-tau Yih, James Glass
Exploiting Abstract Meaning Representation for Open-Domain Question Answering
Cunxiang Wang, Zhikun Xu, Qipeng Guo, Xiangkun Hu, Xuefeng Bai, Zheng Zhang, Yue Zhang
RFiD: Towards Rational Fusion-in-Decoder for Open-Domain Question Answering
Cunxiang Wang, Haofei Yu, Yue Zhang
Exploring Contrast Consistency of Open-Domain Question Answering Systems on Minimally Edited Questions
Zhihan Zhang, Wenhao Yu, Zheng Ning, Mingxuan Ju, Meng Jiang
IfQA: A Dataset for Open-domain Question Answering under Counterfactual Presuppositions
Wenhao Yu, Meng Jiang, Peter Clark, Ashish Sabharwal
On the Risk of Misinformation Pollution with Large Language Models
Yikang Pan, Liangming Pan, Wenhu Chen, Preslav Nakov, Min-Yen Kan, William Yang Wang
Not All Languages Are Created Equal in LLMs: Improving Multilingual Capability by Cross-Lingual-Thought Prompting
Haoyang Huang, Tianyi Tang, Dongdong Zhang, Wayne Xin Zhao, Ting Song, Yan Xia, Furu Wei
Evaluating Open-Domain Question Answering in the Era of Large Language Models
Ehsan Kamalloo, Nouha Dziri, Charles L. A. Clarke, Davood Rafiei