Context-Dependent Question Answering
Context-dependent question answering (CDQA) focuses on building systems that accurately answer questions which can only be understood in light of their surrounding context. Current research emphasizes improving retrieval-augmented generation (RAG), for instance by using determinantal point processes to select diverse, non-conflicting evidence passages, and by applying large language models (LLMs) to question generation and answer evaluation. Progress here matters for natural language understanding broadly and for applications such as education, healthcare, and information retrieval, particularly when queries are complex or ambiguous.
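To make the diversity-aware selection mentioned above concrete, the sketch below shows greedy MAP inference for a determinantal point process (DPP) over retrieved passages. It is a minimal illustration, not the method of any paper listed here: the kernel construction (relevance-weighted cosine similarity), the function name, and the toy demo data are all assumptions introduced for this example.

```python
import numpy as np

def greedy_dpp_select(embeddings, relevance, k):
    """Greedily pick up to k passages under a DPP.

    Assumed kernel: L = diag(q) @ S @ diag(q), where S is cosine
    similarity between passage embeddings and q are relevance scores,
    so the subset determinant trades off relevance against redundancy.
    """
    # Normalize embeddings so the Gram matrix is cosine similarity.
    X = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    S = X @ X.T
    q = np.asarray(relevance, dtype=float)
    L = np.outer(q, q) * S

    selected = []
    for _ in range(min(k, len(q))):
        best, best_gain = None, -np.inf
        for i in range(len(q)):
            if i in selected:
                continue
            idx = selected + [i]
            # Log-determinant of the candidate subset's kernel submatrix.
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            gain = logdet if sign > 0 else -np.inf
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break
        selected.append(best)
    return selected

# Toy usage: choose 3 of 10 candidate passages with random embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(10, 64))
rel = rng.uniform(0.5, 1.0, size=10)
print(greedy_dpp_select(emb, rel, k=3))
```

The greedy log-determinant step penalizes passages that are too similar to those already chosen, which is the property that makes DPPs attractive for assembling non-redundant evidence sets in RAG pipelines.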
Papers
Query Expansion Using Contextual Clue Sampling with Language Models
Linqing Liu, Minghan Li, Jimmy Lin, Sebastian Riedel, Pontus Stenetorp
Few-Shot Visual Question Generation: A Novel Task and Benchmark Datasets
Anurag Roy, David Johnson Ekka, Saptarshi Ghosh, Abir Das
ComSearch: Equation Searching with Combinatorial Strategy for Solving Math Word Problems with Weak Supervision
Qianying Liu, Wenyu Guan, Jianhao Shen, Fei Cheng, Sadao Kurohashi