Epistemic Uncertainty Quantification
Epistemic uncertainty quantification (EUQ) aims to measure the uncertainty that stems from a model's limited knowledge (e.g., too little or unrepresentative training data), distinguishing it from aleatoric uncertainty, the inherent randomness in the data that cannot be reduced by collecting more of it. Current research focuses on improving EUQ methods across model architectures, including Bayesian neural networks and pre-trained models, often employing gradient-based approaches or techniques such as evidential deep learning to estimate uncertainty. Accurate EUQ is crucial for building trustworthy AI systems, particularly in high-stakes applications like healthcare and autonomous driving, because it enables better-calibrated decision-making and improved out-of-distribution detection.
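To make the epistemic/aleatoric split concrete, a common recipe (used with Monte Carlo dropout or deep ensembles) decomposes the predictive entropy into an expected-entropy (aleatoric) term and a mutual-information (epistemic) term. The sketch below is a minimal illustration of that decomposition with MC dropout; the model architecture, dropout rate, and sample counts are illustrative assumptions, not a reference implementation of any particular paper.

```python
# Minimal sketch: epistemic uncertainty via Monte Carlo dropout (assumed setup).
import torch
import torch.nn as nn


class MCDropoutClassifier(nn.Module):
    """Toy classifier with dropout so stochastic forward passes approximate
    sampling from a distribution over model weights (illustrative only)."""

    def __init__(self, in_dim=16, hidden=64, n_classes=3, p=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)


@torch.no_grad()
def epistemic_uncertainty(model, x, n_samples=50):
    """Epistemic uncertainty as mutual information between the prediction and
    the dropout-sampled weights: total entropy minus expected entropy."""
    model.train()  # keep dropout active at inference time
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
    )  # shape: (n_samples, batch, n_classes)
    mean_probs = probs.mean(dim=0)
    # Total uncertainty: entropy of the averaged predictive distribution.
    total = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(-1)
    # Aleatoric part: average entropy of the individual stochastic predictions.
    aleatoric = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean(0)
    # Epistemic part: what remains after removing the aleatoric contribution.
    return total - aleatoric


if __name__ == "__main__":
    model = MCDropoutClassifier()
    x = torch.randn(8, 16)              # dummy batch of 8 inputs
    print(epistemic_uncertainty(model, x))  # higher value => more model uncertainty
```

The same decomposition applies to deep ensembles by replacing the dropout samples with the predictions of independently trained ensemble members.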