Robust Conformal Prediction
Robust conformal prediction aims to construct prediction sets (intervals or sets of possible outcomes) with statistical coverage guarantees, so that the set reliably contains the true value even under data imperfections or adversarial attacks. Current research focuses on adapting conformal prediction to various model architectures, including graph neural networks and other deep learning models, and on handling distribution shift and data contamination through techniques such as weighted conformal prediction and robust training procedures. This work is crucial for building trustworthy AI systems: it provides reliable uncertainty quantification, improves the reliability of predictions in high-stakes applications, and strengthens the robustness of machine learning models to real-world data variation.
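To make the coverage guarantee concrete, here is a minimal sketch of standard split conformal prediction for regression, using only NumPy. The data, the least-squares point predictor, and the 90% target coverage are illustrative assumptions, not from any specific paper; robust variants modify the score or quantile step, but the basic recipe below is the common starting point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = 2x + Gaussian noise (assumed toy setup)
n_train, n_cal, n_test = 200, 200, 1000
x_train = rng.uniform(-1, 1, n_train)
y_train = 2 * x_train + rng.normal(0, 0.3, n_train)
x_cal = rng.uniform(-1, 1, n_cal)
y_cal = 2 * x_cal + rng.normal(0, 0.3, n_cal)
x_test = rng.uniform(-1, 1, n_test)
y_test = 2 * x_test + rng.normal(0, 0.3, n_test)

# Point predictor: least-squares slope fit on the training split only
slope = np.dot(x_train, y_train) / np.dot(x_train, x_train)
predict = lambda x: slope * x

# Nonconformity scores on the held-out calibration split
scores = np.abs(y_cal - predict(x_cal))

# Finite-sample corrected quantile: the ceil((n+1)(1-alpha))/n empirical
# quantile of the scores guarantees marginal coverage >= 1 - alpha
alpha = 0.1
q_level = min(1.0, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction interval [f(x) - q_hat, f(x) + q_hat] for each test point
lower = predict(x_test) - q_hat
upper = predict(x_test) + q_hat
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.3f}")
```

The guarantee is distribution-free but assumes the calibration and test data are exchangeable; the robust methods described above aim to preserve (or approximately recover) this guarantee when that assumption is violated, e.g. by reweighting the calibration scores under a known distribution shift.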