Knowledge Memory
Research on knowledge memory in artificial intelligence focuses on enabling AI systems, particularly large language models (LLMs), to store, retrieve, and use external knowledge effectively in order to strengthen reasoning and response generation. Current work emphasizes efficient knowledge integration, for example by internalizing symbolic knowledge or by building retrieval-augmented generation (RAG) systems with dedicated modules for query rewriting, knowledge filtering, and memory management. These advances aim to improve the accuracy, efficiency, and robustness of AI systems across tasks such as question answering, image captioning, and complex reasoning over knowledge graphs, ultimately yielding more reliable and versatile AI applications.
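To make the pipeline concrete, the sketch below shows how the three modules named above (query rewriting, knowledge filtering, and memory management) might fit together. It is a minimal, self-contained illustration with hypothetical class and function names (MemoryStore, rewrite_query, filter_knowledge, answer); it uses naive lexical matching and a stubbed generation step in place of the LLM and vector store a real RAG system would rely on.

```python
from dataclasses import dataclass, field


@dataclass
class MemoryStore:
    """Toy knowledge memory: a capped list of text entries (hypothetical)."""
    capacity: int = 100
    entries: list[str] = field(default_factory=list)

    def add(self, text: str) -> None:
        # Memory management: evict the oldest entry once capacity is exceeded.
        self.entries.append(text)
        if len(self.entries) > self.capacity:
            self.entries.pop(0)

    def retrieve(self, query: str, k: int = 3) -> list[str]:
        # Naive lexical retrieval: rank stored entries by word overlap
        # with the query (a real system would use embeddings).
        q_words = set(query.lower().split())
        ranked = sorted(
            self.entries,
            key=lambda e: len(q_words & set(e.lower().split())),
            reverse=True,
        )
        return ranked[:k]


def rewrite_query(query: str) -> str:
    # Query rewriting: a real system would ask an LLM to expand or
    # disambiguate the query; here we only normalize it.
    return query.strip().lower()


def filter_knowledge(query: str, passages: list[str]) -> list[str]:
    # Knowledge filtering: keep only passages sharing at least one word
    # with the query, discarding irrelevant retrievals.
    q_words = set(query.split())
    return [p for p in passages if q_words & set(p.lower().split())]


def answer(query: str, memory: MemoryStore) -> str:
    # End-to-end pipeline: rewrite -> retrieve -> filter -> generate.
    rewritten = rewrite_query(query)
    retrieved = memory.retrieve(rewritten)
    context = filter_knowledge(rewritten, retrieved)
    # Generation is stubbed; a real system would prompt an LLM with the
    # filtered context prepended to the original query.
    return f"Answer to {query!r} using context: {context}"


if __name__ == "__main__":
    memory = MemoryStore()
    memory.add("The Eiffel Tower is located in Paris, France.")
    memory.add("LLMs benefit from retrieved external knowledge.")
    print(answer("Where is the Eiffel Tower?", memory))
```

Each stage is deliberately swappable: the rewriting and generation stubs can be replaced with LLM calls, and the lexical retriever with a vector index, without changing the overall control flow.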