Open Domain Question Answering
Open-domain question answering (ODQA) aims to build systems that answer factual questions over vast, unstructured knowledge sources. Current research focuses on improving retrieval methods, particularly dense retrieval, and on integrating large language models (LLMs) into retrieval-augmented generation (RAG) frameworks, including techniques such as in-context learning and adaptive retrieval strategies that cope with noisy or incomplete evidence. These advances are crucial for improving the accuracy and efficiency of ODQA systems, with significant implications for applications ranging from conversational AI to information access and knowledge discovery.
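As a rough illustration of the retrieval-augmented generation pattern described above, the sketch below retrieves the top-k passages for a question from an in-memory dense index and assembles them into a prompt for an answer generator. It is a minimal sketch, not any listed paper's method: the `embed`, `retrieve`, and `answer` functions are hypothetical, the hashed bag-of-words embedder is a toy stand-in for a trained dense encoder (e.g. DPR), and the LLM call is left as a stub.

```python
# Minimal RAG-style ODQA sketch: dense-ish retrieval + prompt assembly.
# All names here are illustrative; swap in a real encoder and LLM in practice.
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic embedding via hashed bag-of-words.
    A real system would use a trained dense encoder instead."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# In-memory passage index (real systems use FAISS or another ANN index).
passages = [
    "Paris is the capital and most populous city of France.",
    "The Eiffel Tower was completed in 1889.",
    "Mount Everest is the highest mountain above sea level.",
]
index = np.stack([embed(p) for p in passages])

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the top-k passages by cosine similarity (vectors are unit-norm)."""
    scores = index @ embed(question)
    top = np.argsort(scores)[::-1][:k]
    return [passages[i] for i in top]

def answer(question: str) -> str:
    """Assemble a RAG-style prompt; the generation step is left as a stub."""
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    # return llm.generate(prompt)  # hypothetical LLM call
    return prompt

print(answer("What is the capital of France?"))
```

The separation between the retriever and the generator is the point of the pattern: the retriever narrows a large corpus to a handful of candidate passages, and the generator conditions only on that retrieved context, which is where adaptive retrieval and noise-robust prompting strategies come into play.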
Papers
Steering Knowledge Selection Behaviours in LLMs via SAE-Based Representation Engineering
Yu Zhao, Alessio Devoto, Giwon Hong, Xiaotang Du, Aryo Pradipta Gema, Hongru Wang, Kam-Fai Wong, Pasquale Minervini
RAG4ITOps: A Supervised Fine-Tunable and Comprehensive RAG Framework for IT Operations and Maintenance
Tianyang Zhang, Zhuoxuan Jiang, Shengguang Bai, Tianrui Zhang, Lin Lin, Yang Liu, Jiawei Ren
Improve Dense Passage Retrieval with Entailment Tuning
Lu Dai, Hao Liu, Hui Xiong