Conformal Score
Conformal prediction (CP) is a framework for quantifying uncertainty in machine learning by constructing prediction sets that are guaranteed to contain the true value with a user-specified probability. The core ingredient is the conformal score, a function that measures how atypical a candidate label is for a given input; prediction sets are formed by comparing scores against a quantile computed on held-out calibration data. Current research focuses on improving the efficiency and robustness of CP, particularly by developing novel conformal scores that are less sensitive to label noise, distribution shift, and class imbalance, and by adapting CP to settings such as regression, multi-user scenarios, and graph-structured data. These advances enhance the reliability and practical applicability of CP across diverse domains, offering valuable tools for trustworthy AI and decision-making in high-stakes applications.
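As a concrete illustration, below is a minimal sketch of split conformal prediction for classification using the common score s(x, y) = 1 - p(y | x), where p is the model's predicted probability. The function name, the random toy data, and the choice of score are illustrative assumptions for this sketch, not the method of any particular paper.

```python
import numpy as np

def split_conformal_classification(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction with the score s(x, y) = 1 - p(y | x).

    cal_probs:  (n, K) predicted class probabilities on a held-out calibration set
    cal_labels: (n,)   true labels for the calibration set
    test_probs: (m, K) predicted class probabilities on test inputs
    alpha:      target miscoverage; sets contain the true label w.p. >= 1 - alpha
    """
    n = len(cal_labels)
    # Conformal score of each calibration example: one minus the probability
    # the model assigned to the true class (higher score = more atypical).
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    qhat = np.quantile(scores, min(q_level, 1.0), method="higher")
    # Prediction set: every class whose score falls at or below the threshold.
    return (1.0 - test_probs) <= qhat

# Toy usage with random "model" probabilities (illustrative only).
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=500)
cal_labels = rng.integers(0, 3, size=500)
test_probs = rng.dirichlet(np.ones(3), size=4)
print(split_conformal_classification(cal_probs, cal_labels, test_probs))
```

The coverage guarantee holds regardless of how accurate the underlying model is; a better model simply yields smaller (more informative) prediction sets, which is why much of the research above targets score design rather than the guarantee itself.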