Conformal Meta-Learners
Conformal meta-learners combine conformal prediction, which provides distribution-free uncertainty intervals for individual predictions, with meta-learning techniques that improve model generalization across diverse tasks. Current research applies this framework to problems such as individual treatment effect estimation and multi-sensor data fusion, often building on doubly robust learners and weighted conformal predictive systems. The approach matters because it offers a principled way to quantify the uncertainty of complex machine learning models, yielding more trustworthy predictions in high-stakes settings such as medical diagnosis and autonomous systems.
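To make the conformal ingredient concrete, below is a minimal sketch of the split conformal procedure that such meta-learners build on: fit a base model on one data split, compute absolute residuals on a held-out calibration split, and take a finite-sample-corrected quantile of those residuals as the interval half-width. The toy data, the linear base model, and all variable names here are illustrative assumptions, not taken from any specific paper; in a conformal meta-learner the base model would instead regress pseudo-outcomes (e.g., doubly robust scores) for treatment effects.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy synthetic data (illustrative): y = 2x + Gaussian noise
n = 2000
X = rng.uniform(-1, 1, size=(n, 1))
y = 2 * X[:, 0] + rng.normal(scale=0.3, size=n)

# Split into a proper training set and a calibration set
X_tr, y_tr = X[:1000], y[:1000]
X_cal, y_cal = X[1000:], y[1000:]

# Base model: least-squares fit, standing in for whatever model
# the meta-learner fits to its pseudo-outcomes
w, *_ = np.linalg.lstsq(np.c_[X_tr, np.ones(len(X_tr))], y_tr, rcond=None)
predict = lambda Z: np.c_[Z, np.ones(len(Z))] @ w

# Split conformal step: calibration residuals -> corrected quantile
alpha = 0.1  # target miscoverage level (90% intervals)
scores = np.abs(y_cal - predict(X_cal))
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]  # finite-sample-corrected quantile

# Interval for a new point; marginal coverage is at least 1 - alpha
x_new = np.array([[0.5]])
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The weighted conformal predictive systems mentioned above refine this recipe by reweighting the calibration scores to correct for distribution shift (e.g., between treated and control populations), but the quantile-of-residuals mechanism is the same.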
Papers
- March 18, 2024
- February 19, 2024
- February 7, 2024
- August 28, 2023
- October 30, 2022