Probability Calibration
Probability calibration focuses on aligning a model's predicted probabilities with the empirical frequencies of the outcomes it predicts, so that the model's stated confidence reflects how often it is actually correct. Current research emphasizes improving calibration across diverse applications, including image segmentation, drug discovery, and natural language processing, employing techniques such as post-hoc calibration methods and Bayesian approaches across a range of model architectures (e.g., convolutional neural networks, transformers). Well-calibrated models are crucial for reliable uncertainty quantification, enabling informed decision-making in high-stakes domains and improving the trustworthiness of machine learning predictions.
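As a concrete illustration of the post-hoc calibration mentioned above, the sketch below uses scikit-learn's CalibratedClassifierCV to fit an isotonic mapping from a classifier's raw scores to calibrated probabilities, and compares reliability-diagram statistics before and after. The dataset and model choices are illustrative assumptions, not drawn from any specific paper in this collection.

```python
# Minimal post-hoc calibration sketch (assumed generic binary task, not from a specific paper).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV, calibration_curve

# Synthetic data standing in for a real task.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Uncalibrated model: tree ensembles often produce over- or under-confident scores.
raw_clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Post-hoc calibration: learn an isotonic map from raw scores to probabilities
# on held-out folds, leaving the underlying model unchanged.
calibrated_clf = CalibratedClassifierCV(
    RandomForestClassifier(n_estimators=200, random_state=0),
    method="isotonic",
    cv=5,
).fit(X_train, y_train)

# Reliability-diagram data: mean predicted probability vs. observed frequency per bin.
for name, clf in [("raw", raw_clf), ("calibrated", calibrated_clf)]:
    prob_pos = clf.predict_proba(X_test)[:, 1]
    frac_pos, mean_pred = calibration_curve(y_test, prob_pos, n_bins=10)
    print(name, list(zip(mean_pred.round(2), frac_pos.round(2))))
```

For a well-calibrated model, the printed pairs lie close to the diagonal: among samples assigned probability near 0.8, roughly 80% should belong to the positive class.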