Intrinsic Out-of-Distribution Detection
Intrinsic out-of-distribution (OOD) detection focuses on improving a model's ability to identify data points that differ significantly from its training distribution, without relying on external OOD data. Current research explores several directions, including modifying training objectives (e.g., incorporating Mahalanobis-distance-based terms), adapting existing model families such as random forests and neural networks (e.g., through conformal prediction or ensembling), and leveraging self-supervised learning. Addressing this challenge is crucial for deploying machine learning models safely in the real world, improving robustness and reliability in applications such as autonomous driving and medical image analysis.
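As one concrete illustration of the score-based approaches mentioned above, below is a minimal NumPy sketch of Mahalanobis-distance OOD scoring in the spirit of Lee et al. (2018): fit class-conditional Gaussians with a tied (shared) covariance to training features, typically a network's penultimate-layer embeddings, then score a test point by its squared Mahalanobis distance to the nearest class mean. The function names `fit_mahalanobis` and `mahalanobis_score` are illustrative, not taken from any specific paper covered here.

```python
import numpy as np

def fit_mahalanobis(features, labels):
    """Fit class-conditional Gaussian means with a shared (tied) covariance.

    features: (N, D) array of training embeddings (assumed here to be
              penultimate-layer features); labels: (N,) class labels.
    """
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Pool the centered features across classes to estimate one covariance.
    centered = np.concatenate(
        [features[labels == c] - means[c] for c in classes], axis=0
    )
    cov = centered.T @ centered / len(features)
    precision = np.linalg.pinv(cov)  # pseudo-inverse for numerical stability
    return means, precision

def mahalanobis_score(x, means, precision):
    """OOD score: squared Mahalanobis distance to the closest class mean.

    Larger scores suggest the input lies farther from the training
    distribution and is more likely to be OOD.
    """
    return min(float((x - mu) @ precision @ (x - mu)) for mu in means.values())

# Toy usage with synthetic features (stand-ins for real embeddings).
rng = np.random.default_rng(0)
train_feats = rng.normal(size=(200, 8))
train_labels = rng.integers(0, 3, size=200)
means, precision = fit_mahalanobis(train_feats, train_labels)

in_dist_score = mahalanobis_score(rng.normal(size=8), means, precision)
ood_score = mahalanobis_score(rng.normal(size=8) * 5.0, means, precision)
print(in_dist_score, ood_score)  # the scaled (off-distribution) point scores higher
```

In practice, a detection threshold on the score is calibrated on held-out in-distribution data, and test points whose scores exceed it are flagged as OOD.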