Conformal Method
Conformal methods provide a powerful framework for quantifying uncertainty in predictions, offering finite-sample validity guarantees under minimal assumptions about the data distribution. Current research focuses on extending conformal prediction to non-exchangeable data, such as time series, and adapting it for various tasks including parameter estimation, anomaly detection, and even mitigating large language model hallucinations. This robust approach is increasingly significant for building trustworthy AI systems across diverse applications, from robotics and industrial simulations to safety-critical domains where reliable uncertainty quantification is paramount.
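To make the coverage guarantee concrete, here is a minimal sketch of split conformal prediction for regression. The data, the least-squares "model", and the 90% target level are illustrative assumptions, not from the original; the quantile step is the standard finite-sample-valid calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = 2x + Gaussian noise (illustrative only)
def make_data(n):
    x = rng.uniform(0, 10, n)
    y = 2 * x + rng.normal(0, 1, n)
    return x, y

# Stand-in predictor: a simple least-squares fit; conformal prediction
# wraps any black-box model the same way
x_train, y_train = make_data(500)
slope, intercept = np.polyfit(x_train, y_train, 1)
predict = lambda x: slope * x + intercept

# Split conformal: score a held-out calibration set by absolute residual
x_cal, y_cal = make_data(500)
scores = np.abs(y_cal - predict(x_cal))

# Finite-sample-valid quantile at miscoverage level alpha:
# take the ceil((n+1)(1-alpha))/n empirical quantile of the scores
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval [y_hat - q, y_hat + q] covers the true y with
# probability >= 1 - alpha, assuming only exchangeability
x_test, y_test = make_data(2000)
y_hat = predict(x_test)
covered = (y_test >= y_hat - q) & (y_test <= y_hat + q)
coverage = covered.mean()
print(f"Empirical coverage: {coverage:.3f} (target >= {1 - alpha})")
```

Note that the guarantee needs no distributional assumptions beyond exchangeability of calibration and test points, which is exactly what extensions to time series and other non-exchangeable data must relax.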