Paper ID: 2503.13317 • Published Mar 17, 2025
Do you understand epistemic uncertainty? Think again! Rigorous frequentist epistemic uncertainty estimation in regression
Enrico Foglia, Benjamin Bobbia, Nikita Durasov, Michael Bauerheim, Pascal Fua, Stephane Moreau, Thierry Jardin
Institut Supérieur de l'Aéronautique et de l'Espace • Université de Sherbrooke • École Polytechnique Fédérale de Lausanne
Quantifying model uncertainty is critical for understanding prediction
reliability, yet distinguishing between aleatoric and epistemic uncertainty
remains challenging. We extend recent work from classification to regression to
provide a novel frequentist approach to epistemic and aleatoric uncertainty
estimation. We train models to generate conditional predictions by feeding
their initial output back as an additional input. This method allows for a
rigorous measurement of model uncertainty by observing how prediction responses
change when conditioned on the model's previous answer. We provide a complete
theoretical framework to analyze epistemic uncertainty in regression in a
frequentist way, and explain how it can be exploited in practice to gauge a
model's uncertainty, with minimal changes to the original architecture.
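The conditioned-prediction idea described above can be sketched in a few lines. This is a minimal illustrative toy, not the authors' implementation: the linear `model`, its feedback weight, and the placeholder value `y0` are assumptions made for the example. The model receives its own previous output as an extra input, and the shift in the answer after self-conditioning acts as a proxy for epistemic uncertainty.

```python
# Toy sketch of two-pass conditioned prediction (illustrative only).
def model(x, y_prev, w):
    # Hypothetical regressor with a feedback weight on the previous answer.
    return w[0] * x + w[1] * y_prev + w[2]

def conditioned_predict(x, w, y0=0.0):
    # Pass 1: predict with a neutral placeholder for the feedback input.
    y1 = model(x, y0, w)
    # Pass 2: condition the model on its own first answer.
    y2 = model(x, y1, w)
    # A confident model barely moves when conditioned on itself, so the
    # shift |y2 - y1| serves as an epistemic-uncertainty proxy.
    return y1, y2, abs(y2 - y1)

w = (2.0, 0.1, 0.5)
y1, y2, shift = conditioned_predict(3.0, w)
# y1 = 6.5, y2 = 7.15, shift = 0.65
```

In a real network the feedback input would be concatenated to the features, which is why the paper notes only minimal changes to the original architecture are needed.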