Uncertainty Quantification
Uncertainty quantification (UQ) aims to assess and represent the confidence in predictions made by machine learning models, which is essential for high-stakes applications where decisions hinge on prediction reliability. Current research focuses on developing robust UQ methods, in particular addressing bias in predictions and efficiently quantifying uncertainty in large language models and deep neural networks, often using techniques such as conformal prediction, Bayesian methods, and ensemble learning. Reliable uncertainty estimates enhance the trustworthiness and applicability of machine learning across diverse fields, from healthcare diagnostics and autonomous driving to climate modeling and drug discovery.
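To make one of the named techniques concrete, below is a minimal sketch of split conformal prediction for regression. It is illustrative only and not taken from any of the papers listed here; the synthetic data, model choice, and coverage level (alpha = 0.1) are assumptions made for the example.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data (illustrative assumption, not from any listed paper).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=2000)

# Split into a proper training set and a held-out calibration set.
X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

# Any point-prediction model can be wrapped; a random forest is used here as an example.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - model.predict(X_cal))

# The (1 - alpha) conformal quantile of the scores yields intervals with
# marginal coverage of at least 1 - alpha under exchangeability.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new input: point prediction plus/minus q.
x_new = np.array([[1.5]])
pred = model.predict(x_new)[0]
print(f"90% prediction interval: [{pred - q:.3f}, {pred + q:.3f}]")

The key design choice is that the calibration data are disjoint from the training data, so the coverage guarantee holds regardless of how well the underlying model fits.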
Papers
Towards Reproducible LLM Evaluation: Quantifying Uncertainty in LLM Benchmark Scores
Robert E. Blackwell, Jon Barry, Anthony G. Cohn
On Uncertainty In Natural Language Processing
Dennis Ulmer
Lightning UQ Box: A Comprehensive Framework for Uncertainty Quantification in Deep Learning
Nils Lehmann, Jakob Gawlikowski, Adam J. Stewart, Vytautas Jancauskas, Stefan Depeweg, Eric Nalisnick, Nina Maria Gottschling
Decision-Focused Uncertainty Quantification
Santiago Cortes-Gomez, Carlos Patiño, Yewon Byun, Steven Wu, Eric Horvitz, Bryan Wilder
Uncertainty Quantification with Bayesian Higher Order ReLU KANs
James Giroux, Cristiano Fanelli
Entropy-Based Uncertainty Modeling for Trajectory Prediction in Autonomous Driving
Aron Distelzweig, Andreas Look, Eitan Kosman, Faris Janjoš, Jörg Wagner, Abhinav Valada
Future-Proofing Medical Imaging with Privacy-Preserving Federated Learning and Uncertainty Quantification: A Review
Nikolas Koutsoubis, Asim Waqas, Yasin Yilmaz, Ravi P. Ramachandran, Matthew Schabath, Ghulam Rasool
AUGUR, A flexible and efficient optimization algorithm for identification of optimal adsorption sites
Ioannis Kouroudis, Poonam, Neel Misciaci, Felix Mayr, Leon Müller, Zhaosu Gu, Alessio Gagliardi
Stratospheric aerosol source inversion: Noise, variability, and uncertainty quantification
J. Hart, I. Manickam, M. Gulian, L. Swiler, D. Bull, T. Ehrmann, H. Brown, B. Wagman, J. Watkins
A Primer on Variational Inference for Physics-Informed Deep Generative Modelling
Alex Glyn-Davies, Arnaud Vadeboncoeur, O. Deniz Akyildiz, Ieva Kazlauskaite, Mark Girolami
CoDiCast: Conditional Diffusion Model for Weather Prediction with Uncertainty Quantification
Jimeng Shi, Bowen Jin, Jiawei Han, Giri Narasimhan
Predicting Critical Heat Flux with Uncertainty Quantification and Domain Generalization Using Conditional Variational Autoencoders and Deep Neural Networks
Farah Alsafadi, Aidan Furlong, Xu Wu