Paper ID: 2207.08200
Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors
Gianluca Detommaso, Alberto Gasparin, Andrew Wilson, Cedric Archambeau
As we move away from the data, the predictive uncertainty should increase, since a wide variety of explanations are consistent with the limited available information. We introduce Distance-Aware Prior (DAP) calibration, a method to correct the overconfidence of Bayesian deep learning models outside the training domain. We define DAPs as prior distributions over the model parameters that depend on the inputs through a measure of their distance from the training set. DAP calibration is agnostic to the posterior inference method, and it can be performed as a post-processing step. We demonstrate its effectiveness against several baselines in a variety of classification and regression problems, including benchmarks designed to test the quality of predictive distributions away from the data.
Submitted: Jul 17, 2022
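
The abstract describes priors whose spread depends on the input's distance from the training set, but does not specify a functional form. Below is a minimal sketch of that idea, assuming a Euclidean nearest-neighbor distance and a linear inflation of the prior standard deviation; the names `min_distance_to_train`, `dap_prior_std`, `base_std`, and `scale` are illustrative choices, not the paper's actual formulation or API.

```python
import numpy as np

def min_distance_to_train(x, X_train):
    """Euclidean distance from a query input to its nearest training input."""
    return np.min(np.linalg.norm(X_train - x, axis=1))

def dap_prior_std(x, X_train, base_std=1.0, scale=0.5):
    """Prior standard deviation over model parameters as a function of the input.

    Hypothetical form (assumption, not from the paper):
        sigma(x) = base_std * (1 + scale * d(x, X_train)),
    so the prior widens, and predictive uncertainty grows, away from the data.
    """
    d = min_distance_to_train(x, X_train)
    return base_std * (1.0 + scale * d)

# Toy usage: prior uncertainty inflates with distance from the training inputs.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 2))
print(dap_prior_std(np.zeros(2), X_train))       # near the data: close to base_std
print(dap_prior_std(10 * np.ones(2), X_train))   # far from the data: inflated std
```

Because the prior enters inference only through its input-dependent scale, a correction of this kind can in principle be applied after fitting, which is consistent with the abstract's claim that DAP calibration works as a post-processing step independent of the posterior inference method.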