Conformal Prediction
Conformal prediction is a model-agnostic framework for generating prediction intervals or sets with guaranteed coverage probabilities, addressing the crucial need for reliable uncertainty quantification in machine learning. Current research focuses on improving the efficiency and robustness of conformal prediction methods, particularly for non-i.i.d. data (e.g., time series, graphs) and biased models, exploring techniques such as adaptive conformal prediction, weighted conformal prediction, and score refinement. These advancements are significant because they enhance the trustworthiness and applicability of machine learning models in high-stakes domains such as healthcare, autonomous systems, and financial modeling, where reliable uncertainty estimates are paramount for safe and informed decision-making.
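To make the core idea concrete, here is a minimal sketch of split conformal prediction for regression, using only NumPy. The data, the least-squares "model", and the miscoverage level alpha = 0.1 are all illustrative assumptions, not drawn from any of the papers listed below; the point is that the interval's coverage guarantee depends only on exchangeability of the calibration and test data, not on model quality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative): y = 2x + Gaussian noise.
x = rng.uniform(0, 10, 500)
y = 2 * x + rng.normal(0, 1, 500)

# Split into a training set (to fit any model) and a calibration set.
x_train, y_train = x[:250], y[:250]
x_cal, y_cal = x[250:], y[250:]

# "Model": a simple least-squares line fit on the training split.
# Any predictor could be used here; conformal prediction wraps it.
slope, intercept = np.polyfit(x_train, y_train, 1)
predict = lambda t: slope * t + intercept

# Nonconformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile: the ceil((n + 1)(1 - alpha)) / n empirical
# quantile of the calibration scores gives finite-sample coverage.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point: P(y in interval) >= 1 - alpha
# under exchangeability, however poorly the model fits.
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
```

A poorly fit model simply yields larger residuals and hence wider intervals; the guarantee itself is unaffected, which is what "model-agnostic" means here.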
Papers
An Information Theoretic Perspective on Conformal Prediction
Alvaro H. C. Correia, Fabio Valerio Massoli, Christos Louizos, Arash Behboodi
A comparative study of conformal prediction methods for valid uncertainty quantification in machine learning
Nicolas Dewolf
Conformal Prediction for Natural Language Processing: A Survey
Margarida M. Campos, António Farinhas, Chrysoula Zerva, Mário A. T. Figueiredo, André F. T. Martins
Safe POMDP Online Planning among Dynamic Agents via Adaptive Conformal Prediction
Shili Sheng, Pian Yu, David Parker, Marta Kwiatkowska, Lu Feng
Metric-guided Image Reconstruction Bounds via Conformal Prediction
Matt Y Cheung, Tucker J Netherton, Laurence E Court, Ashok Veeraraghavan, Guha Balakrishnan