External Knowledge

External knowledge integration aims to enhance large language models (LLMs) by supplementing their internal knowledge with information from external sources such as knowledge graphs and web data. Current research focuses on improving retrieval methods, particularly within Retrieval-Augmented Generation (RAG) frameworks, and on developing techniques to mitigate issues such as "shortcutting" (over-reliance on retrieved context) and hallucinations (fabricating unsupported information). This work is crucial for improving the accuracy, reliability, and explainability of LLMs across diverse applications, from question answering and medical diagnosis to robotics and fact-checking.
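
As a rough illustration of the RAG pattern referenced above, the sketch below retrieves the passages most relevant to a query and prepends them to the prompt before generation. It is a minimal toy example: the corpus, the keyword-overlap retriever, and the stubbed `generate` function are illustrative assumptions, not the API of any specific system discussed here.

```python
# Minimal RAG sketch (illustrative only): retrieve passages by keyword
# overlap, then prepend them to the prompt for a (stubbed) LLM call.

CORPUS = [
    "Knowledge graphs encode entities and relations as triples.",
    "Retrieval-augmented generation grounds LLM answers in retrieved text.",
    "Hallucination refers to a model fabricating unsupported statements.",
]

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Score passages by word overlap with the query; return the top-k."""
    q_terms = set(query.lower().split())
    scored = sorted(corpus, key=lambda p: -len(q_terms & set(p.lower().split())))
    return scored[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Combine retrieved context and the question into a single prompt."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

def generate(prompt: str) -> str:
    """Placeholder for an LLM call; a real system would query a model here."""
    return f"[LLM response to a {len(prompt)}-character prompt]"

if __name__ == "__main__":
    question = "What is retrieval-augmented generation?"
    answer = generate(build_prompt(question, retrieve(question, CORPUS)))
    print(answer)
```

Real systems replace the keyword retriever with dense embedding search and the stub with an actual model call; the grounding step, conditioning generation on retrieved evidence, is what the mitigation work on shortcutting and hallucination targets.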

Papers