Conformal Prediction
Conformal prediction is a model-agnostic framework for generating prediction intervals or sets with guaranteed coverage probabilities, addressing the need for reliable uncertainty quantification in machine learning. Current research focuses on improving the efficiency and robustness of conformal methods, particularly for non-i.i.d. data (e.g., time series, graphs) and biased models, through techniques such as adaptive conformal prediction, weighted conformal prediction, and score refinement. These advances enhance the trustworthiness and applicability of machine learning models in high-stakes domains such as healthcare, autonomous systems, and financial modeling, where reliable uncertainty estimates underpin safe and informed decision-making.
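To make the framework concrete, here is a minimal sketch of split (inductive) conformal prediction for regression, assuming exchangeable data, a simple least-squares point predictor, and absolute residuals as the nonconformity score; the synthetic data and all variable names are illustrative, not taken from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data: y = 2x + noise (illustrative only).
x = rng.uniform(0, 1, 200)
y = 2 * x + rng.normal(0, 0.1, 200)

# Split into a proper training set and a held-out calibration set.
x_train, y_train = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

# Any point predictor works (model-agnostic); here, a least-squares line.
slope, intercept = np.polyfit(x_train, y_train, 1)
predict = lambda x_new: slope * x_new + intercept

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# Finite-sample-corrected quantile for 90% marginal coverage.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point: [f(x) - q, f(x) + q],
# guaranteed to cover the true y with probability >= 1 - alpha.
x_new = 0.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The interval width 2q adapts to how well the model fits: a poor predictor yields larger residuals and hence wider, but still valid, intervals. The research directions above (adaptive, weighted, score-refined variants) modify the score or the quantile step to handle distribution shift and non-i.i.d. data.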
Papers
Conformal Recursive Feature Elimination
Marcos López-De-Castro, Alberto García-Galindo, Rubén Armañanzas
Valid Conformal Prediction for Dynamic GNNs
Ed Davis, Ian Gallagher, Daniel John Lawson, Patrick Rubin-Delanchy
Verifiably Robust Conformal Prediction
Linus Jeary, Tom Kuipers, Mehran Hosseini, Nicola Paoletti
Conformal Depression Prediction
Yonghong Li, Xiuzhuang Zhou
Towards Human-AI Complementarity with Prediction Sets
Giovanni De Toni, Nastaran Okati, Suhas Thejaswi, Eleni Straitouri, Manuel Gomez-Rodriguez
Kernel-based optimally weighted conformal prediction intervals
Jonghyeok Lee, Chen Xu, Yao Xie
CHAMP: Conformalized 3D Human Multi-Hypothesis Pose Estimators
Harry Zhang, Luca Carlone