Conformal Coverage
Conformal prediction offers a powerful framework for constructing prediction intervals with finite-sample coverage guarantees, assuming only exchangeability of the data, and thereby addresses the crucial need for reliable uncertainty quantification in machine learning. Current research focuses on extending conformal methods to diverse settings, including federated learning (handling distributed and potentially adversarial data), sequential data (where distributions evolve over time), and specific applications such as object detection and natural language processing. This rigorous approach to uncertainty quantification is increasingly vital for ensuring the safety and trustworthiness of AI systems across domains ranging from autonomous driving to healthcare.
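To make the coverage guarantee concrete, here is a minimal sketch of split conformal prediction for regression: residuals on a held-out calibration set determine a quantile that widens point predictions into intervals. The function name, the toy linear model, and all parameters below are illustrative assumptions, not taken from the source.

```python
import numpy as np

def split_conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal prediction: intervals with >= 1 - alpha
    marginal coverage under exchangeability."""
    # Nonconformity scores: absolute residuals on the calibration set
    scores = np.abs(y_cal - predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile level ceil((n+1)(1-alpha))/n
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level, method="higher")
    preds = predict(X_test)
    return preds - q, preds + q

# Toy usage: pretend the model was fit on separate training data
rng = np.random.default_rng(0)
X_cal = rng.uniform(0, 1, 200)
y_cal = 2 * X_cal + rng.normal(0, 0.1, 200)
predict = lambda X: 2 * X  # hypothetical fitted model
lo, hi = split_conformal_interval(predict, X_cal, y_cal, np.array([0.5]))
```

The guarantee is marginal (averaged over calibration and test draws), which is why much of the research above targets settings, such as distribution shift, where the exchangeability assumption breaks down.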