Evidential Reasoning
Evidential reasoning focuses on integrating uncertain information from multiple sources to reach reliable conclusions, addressing challenges in areas like semantic mapping, medical diagnosis, and fact verification. Current research emphasizes developing robust algorithms, such as those based on Dempster-Shafer theory and adaptive Lasso methods, to handle conflicting evidence and improve uncertainty quantification within deep learning models and large language models. This work is significant for enhancing the reliability and trustworthiness of AI systems across diverse applications, particularly in domains requiring high levels of accuracy and accountability.
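Since the summary highlights Dempster-Shafer theory as a core tool for fusing conflicting evidence, a minimal sketch of Dempster's rule of combination may help make the idea concrete. The function name `combine_dempster` and the two-sensor diagnosis example are purely illustrative assumptions, not code from any of the referenced papers.

```python
from itertools import product

def combine_dempster(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset hypotheses (focal elements) to masses
    that each sum to 1. Returns the fused mass function and the conflict
    mass K that is renormalized away.
    """
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        intersection = b & c
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass placed on contradictory evidence
    if conflict >= 1.0:
        raise ValueError("Sources are in total conflict; combination undefined.")
    # Renormalize so the combined masses again sum to 1.
    combined = {a: v / (1.0 - conflict) for a, v in combined.items()}
    return combined, conflict


if __name__ == "__main__":
    # Toy diagnosis example: two sources report beliefs over {flu, cold}.
    flu, cold = frozenset({"flu"}), frozenset({"cold"})
    either = flu | cold  # mass on the whole frame expresses ignorance
    source_a = {flu: 0.6, either: 0.4}
    source_b = {flu: 0.3, cold: 0.5, either: 0.2}
    fused, k = combine_dempster(source_a, source_b)
    print("conflict K =", round(k, 3))
    for hypothesis, mass in fused.items():
        print(set(hypothesis), round(mass, 3))
```

The conflict mass K provides one simple measure of how strongly the sources disagree, which is the kind of uncertainty signal the robust combination methods described above aim to handle more gracefully than plain renormalization.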