External Knowledge
External knowledge integration aims to enhance large language models (LLMs) by supplementing their internal knowledge with information from external sources like knowledge graphs and web data. Current research focuses on improving retrieval methods, particularly within Retrieval Augmented Generation (RAG) frameworks, and developing techniques to mitigate issues like "shortcutting" (over-reliance on retrieved context) and hallucinations (fabricating information). This work is crucial for increasing the accuracy, reliability, and explainability of LLMs across diverse applications, ranging from question answering and medical diagnosis to robotics and fact-checking.
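As a concrete illustration of the retrieve-then-generate pattern described above, the sketch below pairs a simple TF-IDF retriever with a placeholder generation step. The corpus, query, and `call_llm` stub are illustrative assumptions for this overview, not the method of any particular paper; a real RAG system would substitute a dense retriever or knowledge-graph lookup and an actual LLM call.

```python
# Minimal RAG sketch: retrieve supporting passages, augment the prompt,
# then generate. The corpus and `call_llm` stub are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Knowledge graphs store entities and typed relations between them.",
    "Retrieval Augmented Generation conditions an LLM on retrieved passages.",
    "Hallucination refers to fluent but unsupported model output.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank corpus passages by TF-IDF cosine similarity to the query."""
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform(corpus + [query])
    scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()
    top = scores.argsort()[::-1][:k]
    return [corpus[i] for i in top]

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (hypothetical)."""
    return f"[answer generated from a {len(prompt)}-character grounded prompt]"

def answer(query: str) -> str:
    # Augment the prompt with retrieved context so the model can ground
    # its answer in external knowledge rather than parametric memory alone.
    context = "\n".join(f"- {p}" for p in retrieve(query))
    prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("What is Retrieval Augmented Generation?"))
```

Mitigations such as reducing "shortcutting" or hallucination typically intervene in this loop, for example by filtering or re-ranking retrieved passages, or by verifying the generated answer against the retrieved evidence before returning it.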