Paper ID: 2402.04344

Delving into Temperature Scaling for Adaptive Conformal Prediction

Huajun Xi, Jianguo Huang, Kangdao Liu, Lei Feng, Hongxin Wei

Conformal prediction, as an emerging uncertainty quantification technique, constructs prediction sets that are guaranteed to contain the true label with a pre-defined probability. Previous works often employ temperature scaling to calibrate the classifier, assuming that confidence calibration benefits conformal prediction. In this work, we empirically show that current confidence calibration methods (e.g., temperature scaling) typically lead to larger prediction sets in adaptive conformal prediction. Theoretically, we prove that a prediction with higher confidence can result in a smaller prediction set in expectation. Inspired by this analysis, we propose \textit{Conformal Temperature Scaling} (ConfTS), a variant of temperature scaling that aims to improve the efficiency of adaptive conformal prediction. Specifically, ConfTS optimizes the temperature value by minimizing the gap between the threshold and the non-conformity score of the ground-truth label on a held-out validation dataset. In this way, the resulting temperature yields prediction sets of high efficiency without violating the marginal coverage property. Extensive experiments demonstrate that our method effectively enhances adaptive conformal prediction methods in both efficiency and conditional coverage, reducing the average size of APS and RAPS by nearly 50\% on ImageNet at error rate $\alpha=0.1$.
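The abstract's description suggests a straightforward realization of ConfTS: treat the conformal threshold as a differentiable function of the temperature and minimize its squared gap to the ground-truth non-conformity score on held-out data. Below is a minimal PyTorch sketch under that reading, assuming non-randomized APS scores and the standard finite-sample-corrected quantile; `aps_scores`, `confts_loss`, and `val_loader` are hypothetical names for illustration, not the authors' code.

```python
import torch

def aps_scores(probs, labels):
    """Non-randomized APS non-conformity score of the true label:
    total probability mass of classes ranked at or above it."""
    sorted_probs, order = probs.sort(dim=1, descending=True)
    cum = sorted_probs.cumsum(dim=1)
    # scatter cumulative sums back to the original class positions
    scores = torch.zeros_like(cum).scatter_(1, order, cum)
    return scores.gather(1, labels.unsqueeze(1)).squeeze(1)

def confts_loss(logits, labels, alpha=0.1):
    """Squared gap between the conformal threshold tau and the
    ground-truth scores, both differentiable in the temperature."""
    probs = torch.softmax(logits, dim=1)
    true_scores = aps_scores(probs, labels)
    n = true_scores.numel()
    # finite-sample-corrected quantile level, clipped to 1
    q = min((n + 1) * (1 - alpha) / n, 1.0)
    tau = torch.quantile(true_scores, q)
    return ((tau - true_scores) ** 2).mean()

# Fit a single temperature parameter on held-out validation logits
# (val_loader is a hypothetical DataLoader of (logits, labels) batches).
temperature = torch.nn.Parameter(torch.ones(1))
optimizer = torch.optim.Adam([temperature], lr=1e-2)
for logits, labels in val_loader:
    optimizer.zero_grad()
    confts_loss(logits / temperature, labels, alpha=0.1).backward()
    optimizer.step()
```

After fitting, the learned temperature is applied to test-time logits before running the usual APS or RAPS set construction, so the marginal coverage guarantee of split conformal prediction is unaffected.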

Submitted: Feb 6, 2024