Dempster-Shafer Theory
Dempster-Shafer Theory (DST) is a mathematical framework for representing and combining uncertain evidence, aiming to improve decision-making in situations with incomplete or conflicting information. Current research focuses on integrating DST with deep learning models, particularly for multi-view classification and medical image segmentation, often employing novel combination rules to address limitations of the standard Dempster's rule and incorporating uncertainty quantification. This approach enhances the reliability and robustness of AI systems across various applications, from autonomous driving to medical diagnosis, by providing more trustworthy and explainable predictions.
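To make the core mechanism concrete, here is a minimal sketch of the standard Dempster's rule of combination mentioned above: two mass functions over a frame of discernment are combined by multiplying masses of intersecting focal sets and renormalizing by the total non-conflicting mass. The `dempster_combine` function and the rain/sun example are illustrative, not drawn from any particular paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions via Dempster's rule.

    Each mass function is a dict mapping a focal set (frozenset of
    hypotheses) to its mass; masses should sum to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Product mass goes to the intersection of the focal sets.
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            # Disjoint focal sets contribute to the conflict mass.
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

# Illustrative example: two sensors reporting on a frame {rain, sun}.
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"sun"}): 0.7, frozenset({"rain", "sun"}): 0.3}
fused = dempster_combine(m1, m2)
```

Note how the conflicting mass (0.6 x 0.7 = 0.42 here) is simply discarded and renormalized away; this behavior under highly conflicting evidence is exactly the limitation of the standard rule that the novel combination rules in the research above aim to address.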