Conformal Prediction
Conformal prediction is a model-agnostic framework for generating prediction intervals or sets with finite-sample coverage guarantees under an exchangeability assumption, addressing the crucial need for reliable uncertainty quantification in machine learning. Current research focuses on improving the efficiency and robustness of conformal prediction, particularly for non-i.i.d. data (e.g., time series, graphs) and biased models, through techniques such as adaptive conformal prediction, weighted conformal prediction, and score refinement. These advances are significant because they enhance the trustworthiness and applicability of machine learning models in high-stakes domains such as healthcare, autonomous systems, and financial modeling, where reliable uncertainty estimates are essential for safe and informed decision-making.
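To make the coverage guarantee concrete, below is a minimal sketch of split (inductive) conformal prediction for regression. It assumes a generic regressor with a scikit-learn-style predict method already fitted on a separate training set; the function name and the absolute-residual nonconformity score are illustrative choices, not taken from any of the papers listed here.

import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal prediction intervals for regression.

    Given a model fitted on data disjoint from the calibration set,
    compute absolute-residual nonconformity scores on the calibration
    set and return intervals with marginal coverage >= 1 - alpha,
    assuming calibration and test points are exchangeable.
    """
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))

    # Finite-sample-corrected quantile level; clamp to 1.0 when the
    # calibration set is too small for the requested alpha.
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level, method="higher")

    # Symmetric interval around each point prediction.
    preds = model.predict(X_test)
    return preds - q_hat, preds + q_hat

The (n + 1) correction in the quantile is what delivers the finite-sample guarantee: as long as the calibration and test points are exchangeable, the returned interval covers the true label with probability at least 1 - alpha, regardless of how well the underlying model fits.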
Papers
Addressing Uncertainty in LLMs to Enhance Reliability in Generative AI
Ramneet Kaur, Colin Samplawski, Adam D. Cobb, Anirban Roy, Brian Matejek, Manoj Acharya, Daniel Elenius, Alexander M. Berenbeim, John A. Pavlik, Nathaniel D. Bastian, Susmit Jha
Conformal-in-the-Loop for Learning with Imbalanced Noisy Data
John Brandon Graham-Knight, Jamil Fayyad, Nourhan Bayasi, Patricia Lasserre, Homayoun Najjaran
Semiparametric conformal prediction
Ji Won Park, Robert Tibshirani, Kyunghyun Cho
Conformalized Prediction of Post-Fault Voltage Trajectories Using Pre-trained and Finetuned Attention-Driven Neural Operators
Amirhossein Mollaali, Gabriel Zufferey, Gonzalo Constante-Flores, Christian Moya, Can Li, Guang Lin, Meng Yue
Conformal prediction of circular data
Paulo C. Marques F., Rinaldo Artes, Helton Graziadei