Intrinsic Out-of-Distribution Detection

Intrinsic out-of-distribution (OOD) detection focuses on improving a model's ability to identify data points that differ significantly from its training distribution, without relying on external OOD data. Current research explores several approaches: modifying training loss functions (e.g., using Mahalanobis distance), adapting existing models such as random forests and neural networks (e.g., through conformal prediction or ensemble methods), and leveraging self-supervised learning. Addressing this challenge is crucial for deploying machine learning models safely in the real world, improving robustness and reliability in applications such as autonomous driving and medical image analysis.
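As a concrete illustration of the Mahalanobis-distance idea mentioned above, the sketch below (all function names and the toy data are illustrative, not taken from any specific paper) fits per-class Gaussian statistics to training features and scores a test point by its squared Mahalanobis distance to the nearest class mean; larger scores suggest the point is further from the training distribution.

```python
import numpy as np

def fit_gaussian_stats(features, labels):
    """Per-class means and the precision of a shared (tied) covariance."""
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    centered = np.concatenate(
        [features[labels == c] - means[c] for c in classes]
    )
    cov = centered.T @ centered / len(features)
    # Small ridge term keeps the covariance invertible.
    precision = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
    return means, precision

def mahalanobis_ood_score(x, means, precision):
    """OOD score = squared distance to the closest class mean; larger = more OOD."""
    return min(
        float((x - mu) @ precision @ (x - mu)) for mu in means.values()
    )

# Toy usage: two in-distribution clusters and one far-away test point.
rng = np.random.default_rng(0)
feats = np.concatenate([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)
means, precision = fit_gaussian_stats(feats, labels)
in_score = mahalanobis_ood_score(np.array([0.1, -0.2]), means, precision)
ood_score = mahalanobis_ood_score(np.array([20.0, -20.0]), means, precision)
assert ood_score > in_score  # the distant point scores as more OOD
```

In practice, the features would be penultimate-layer activations of a trained network rather than raw inputs, and a threshold on the score (chosen on held-in validation data) would decide whether to flag a point as OOD.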

Papers