Conformal Score

Conformal prediction (CP) is a framework for quantifying uncertainty in machine learning predictions by constructing prediction sets that are guaranteed to contain the true value with a user-specified probability. At its core is the conformal score (also called the nonconformity score), which measures how atypical a candidate label is for a given input; labels whose scores fall below a threshold calibrated on held-out data are included in the prediction set. Current research focuses on improving the efficiency and robustness of CP, particularly by developing novel conformal scores that are less sensitive to label noise, distribution shifts, and imbalanced datasets, and by adapting CP to settings such as regression, multi-user scenarios, and graph-structured data. These advances make CP more reliable and practical across diverse domains, offering valuable tools for trustworthy AI and decision-making in high-stakes applications.
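As a concrete illustration of the role the conformal score plays, below is a minimal sketch of split conformal prediction for classification, using the classic score 1 − p̂(y|x) (one minus the probability the model assigns to the true class). The function and variable names are illustrative, not taken from any particular paper:

```python
import numpy as np

def split_conformal_classifier(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for classification (illustrative sketch).

    cal_probs:  (n, K) predicted class probabilities on a calibration set
    cal_labels: (n,)   integer true labels for the calibration set
    test_probs: (m, K) predicted class probabilities on test inputs
    alpha:      miscoverage level; under exchangeability, the returned sets
                contain the true label with probability >= 1 - alpha (marginal).
    """
    n = len(cal_labels)
    # Conformal score: 1 minus the probability assigned to the true class.
    # Larger scores mean the model found the true label less plausible.
    cal_scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected (1 - alpha) quantile of the calibration scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(cal_scores, q_level, method="higher")
    # Prediction set: every label whose score is at or below the threshold,
    # i.e. whose predicted probability is at least 1 - q_hat.
    return test_probs >= 1.0 - q_hat  # boolean mask of shape (m, K)

# Toy usage with random softmax-style probabilities (illustration only):
rng = np.random.default_rng(0)
cal_p = rng.dirichlet(np.ones(3), size=500)
cal_y = rng.integers(0, 3, size=500)
test_p = rng.dirichlet(np.ones(3), size=2)
print(split_conformal_classifier(cal_p, cal_y, test_p))
```

The 1 − p̂ score used here is the simplest baseline; much of the research summarized above is about replacing it with scores that yield smaller sets or remain valid under label noise, distribution shift, or class imbalance, while the calibration-and-threshold procedure stays essentially the same.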

Papers