Advocating for a Combined Use of OOD Detection and Conformal Prediction
by
Paul Novello(IRT Saint Exupéry, INSA Toulouse)
Salle K. Johnson (1R3, 1st floor)
Description
Research on Out-Of-Distribution (OOD) detection focuses mainly on building scores that efficiently distinguish OOD data from In-Distribution (ID) data. On the other hand, Conformal Prediction (CP) uses non-conformity scores to construct prediction sets with probabilistic coverage guarantees. In other words, the former designs scores, while the latter designs probabilistic guarantees based on scores. This position paper argues that these two fields exhibit potentially impactful synergies. We defend this position by formalizing the link between them and emphasizing its benefits for both fields.

First, for OOD detection, we show that in standard OOD benchmark settings, evaluation metrics can be affected by the finite sample size of the validation dataset. Extending the work of Bates et al. 2022, we define new conformal AUROC and conformal FPR@TPR95 metrics: corrections that provide probabilistic guarantees on the variability of the FPR underlying these metrics across validation datasets. We show the effect of these corrections on two reference OOD and anomaly detection benchmarks, OpenOOD (Yang et al. 2022) and ADBench (Han et al. 2022).

Second, for CP, we explore using OOD scores as non-conformity scores and show that they can improve the efficiency of the prediction sets obtained with CP.
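To make the CP side of the abstract concrete, here is a minimal sketch of split conformal prediction for classification. It uses synthetic probability vectors and the common `1 - p(true class)` non-conformity score; the talk's proposal is to swap in an OOD score instead, but any score function plugs into the same recipe. All names and data here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_test, n_classes, alpha = 500, 500, 5, 0.1  # alpha = 1 - target coverage

# Synthetic stand-in for a classifier: Dirichlet probability vectors
# tilted toward the true class (in practice these come from a trained model).
def make_batch(n):
    labels = rng.integers(n_classes, size=n)
    probs = rng.dirichlet(np.ones(n_classes), size=n)
    probs[np.arange(n), labels] += 1.0          # make the true class likely
    probs /= probs.sum(axis=1, keepdims=True)
    return probs, labels

cal_p, cal_y = make_batch(n_cal)
test_p, test_y = make_batch(n_test)

# Non-conformity score on the calibration set: 1 - p(true class).
scores = 1.0 - cal_p[np.arange(n_cal), cal_y]

# Conformal quantile with the finite-sample correction (n + 1 in the numerator).
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

# Prediction sets: every class whose score falls below the threshold.
pred_sets = (1.0 - test_p) <= qhat
coverage = pred_sets[np.arange(n_test), test_y].mean()
print(f"empirical coverage: {coverage:.3f} (target >= {1 - alpha})")
```

The coverage guarantee holds marginally over exchangeable calibration/test draws; "efficiency", the quantity the talk aims to improve via OOD scores, corresponds to the average size of `pred_sets`.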