Knowledge-Based
Knowledge-based systems research focuses on effectively integrating and utilizing knowledge within artificial intelligence, primarily aiming to improve the accuracy, reliability, and interpretability of AI models. Current research emphasizes enhancing large language models (LLMs) with external knowledge graphs, employing techniques like retrieval-augmented generation and knowledge distillation to overcome limitations such as hallucinations and catastrophic forgetting. This work is significant because it addresses critical challenges in AI, leading to more robust and trustworthy systems with applications in diverse fields like education, healthcare, and materials science.
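To make the retrieval-augmented generation idea above concrete, here is a minimal, self-contained sketch: a toy bag-of-words retriever selects the most relevant external facts for a query and prepends them to the prompt, so the model's answer is grounded in retrieved knowledge rather than parametric memory alone. All function names (`embed`, `retrieve`, `build_prompt`) and the tiny corpus are illustrative assumptions, not any specific system's API; production systems use learned dense embeddings and a vector index instead.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding"; real RAG systems use learned dense vectors.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    # Return the k documents most similar to the query.
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, corpus, k=2):
    # Ground the generator in retrieved facts to reduce hallucination.
    context = "\n".join(f"- {fact}" for fact in retrieve(query, corpus, k))
    return f"Answer using only these facts:\n{context}\n\nQuestion: {query}"

corpus = [
    "Knowledge graphs store entities and relations as triples.",
    "Retrieval-augmented generation grounds LLM outputs in external documents.",
    "Knowledge distillation transfers a teacher model's behavior to a smaller student.",
]
print(build_prompt("How does retrieval-augmented generation reduce hallucinations?", corpus))
```

The augmented prompt would then be passed to an LLM; the retrieval step is what injects external knowledge at inference time without retraining, which is also why it helps with catastrophic forgetting.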
Papers
Knowledge Distillation-Empowered Digital Twin for Anomaly Detection
Qinghua Xu, Shaukat Ali, Tao Yue, Zaimovic Nedim, Inderjeet Singh
Viewing the process of generating counterfactuals as a source of knowledge: a new approach for explaining classifiers
Vincent Lemaire, Nathan Le Boudec, Victor Guyomard, Françoise Fessant