Conformal Prediction
Conformal prediction is a model-agnostic framework for generating prediction intervals or sets with guaranteed coverage probabilities, addressing the crucial need for reliable uncertainty quantification in machine learning. The coverage guarantee holds under the assumption that calibration and test data are exchangeable, which is why current research focuses on improving the efficiency and robustness of conformal prediction methods for non-i.i.d. data (e.g., time series, graphs) and biased models, using techniques such as adaptive conformal prediction, weighted conformal prediction, and score refinement. These advances are significant because they enhance the trustworthiness and applicability of machine learning models in high-stakes domains such as healthcare, autonomous systems, and financial modeling, where reliable uncertainty estimates are essential for safe and informed decision-making.
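As a concrete illustration of the basic framework described above (not of any specific method from the papers listed below), here is a minimal sketch of split (inductive) conformal prediction for regression. It assumes a scikit-learn-style model with a `predict` method; `split_conformal_interval` is an illustrative helper name, not an API from any library or paper.

```python
import numpy as np


def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal prediction intervals for regression.

    Given a model already fitted on a separate training split, computes
    absolute-residual nonconformity scores on a held-out calibration set
    and returns intervals with marginal coverage >= 1 - alpha, assuming
    calibration and test points are exchangeable.
    """
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha))/n.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q_hat = np.quantile(scores, q_level, method="higher")
    preds = model.predict(X_test)
    return preds - q_hat, preds + q_hat


# Usage sketch with synthetic data and a scikit-learn regressor.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=10, noise=5.0, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
lo, hi = split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1)
print(f"Empirical coverage: {np.mean((y_test >= lo) & (y_test <= hi)):.3f}")  # ~0.90
```

The finite-sample quantile correction is what yields the marginal coverage guarantee; the adaptive and weighted variants mentioned above modify this calibration step to cope with distribution shift and dependent data, where exchangeability fails.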
Papers
Conformal Prediction Sets Can Cause Disparate Impact
Jesse C. Cresswell, Bhargava Kumar, Yi Sui, Mouloud Belbahri
Decision-Focused Uncertainty Quantification
Santiago Cortes-Gomez, Carlos Patiño, Yewon Byun, Steven Wu, Eric Horvitz, Bryan Wilder
Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering
Klaus-Rudolf Kladny, Bernhard Schölkopf, Michael Muehlebach
Benchmarking Graph Conformal Prediction: Empirical Analysis, Scalability, and Theoretical Insights
Pranav Maneriker, Aditya T. Vadlamani, Anutam Srinivasan, Yuntian He, Ali Payani, Srinivasan Parthasarathy
Adjusting Regression Models for Conditional Uncertainty Calibration
Ruijiang Gao, Mingzhang Yin, James McInerney, Nathan Kallus