Dempster-Shafer Theory

Dempster-Shafer Theory (DST) is a mathematical framework for representing and combining uncertain evidence: it assigns belief mass to sets of hypotheses rather than to single outcomes, which lets it model ignorance and partial support directly and improves decision-making under incomplete or conflicting information. Current research focuses on integrating DST with deep learning models, particularly for multi-view classification and medical image segmentation, often employing novel combination rules that address limitations of the standard Dempster's rule (notably its counterintuitive behavior under high conflict) and incorporating uncertainty quantification. This enhances the reliability and robustness of AI systems across applications from autonomous driving to medical diagnosis by providing more trustworthy and explainable predictions.
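As a minimal sketch of the core operation, the following Python code implements Dempster's rule of combination for two mass functions over a shared frame of discernment. The function name, the toy frame `{flu, cold}`, and the mass values are illustrative assumptions, not drawn from any specific paper above.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements to masses (each sums to 1).
    Products over empty intersections are the conflict mass K, which is
    normalized out; the rule is undefined under total conflict (K = 1).
    """
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # evidence supports disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined")
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}


# Hypothetical example: two diagnostic sources over the frame {flu, cold}.
m1 = {frozenset({"flu"}): 0.6, frozenset({"flu", "cold"}): 0.4}
m2 = {frozenset({"cold"}): 0.5, frozenset({"flu", "cold"}): 0.5}
combined = dempster_combine(m1, m2)
# Conflict K = 0.6 * 0.5 = 0.3, so masses are renormalized by 0.7:
# m({flu}) = 3/7, m({cold}) = 2/7, m({flu, cold}) = 2/7
```

The normalization step is exactly what the novel combination rules mentioned above tend to modify: under high conflict it can assign nearly all mass to a hypothesis both sources barely support.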

Papers