Out-of-Distribution Detection
Out-of-distribution (OOD) detection aims to identify data points that differ significantly from a machine learning model's training data, a capability crucial for reliable and safe model deployment. Current research focuses on developing novel scoring functions and model architectures, including those based on diffusion models, variational autoencoders, and vision-language models, to improve the accuracy and efficiency of OOD detection, often addressing challenges posed by imbalanced datasets and limited access to model parameters. This field is vital for enhancing the trustworthiness of AI systems across diverse applications, from autonomous driving to medical diagnosis, by mitigating the risks of making predictions on unseen data. A growing emphasis is placed on methods that are both effective and computationally efficient, particularly for resource-constrained environments.
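To make the idea of a "scoring function" concrete, the following is a minimal sketch of two widely used post-hoc baselines, maximum softmax probability and the energy score, computed from a classifier's raw logits. This illustrates the general technique only; it is not the method of any specific paper listed below, and the example logits are invented for illustration.

```python
import numpy as np

def max_softmax_prob(logits):
    """Maximum softmax probability (MSP) baseline.

    Higher score = the model is more confident = more likely
    in-distribution. OOD inputs tend to get flatter softmax outputs.
    """
    z = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)

def energy_score(logits):
    """Negative free energy: logsumexp over the logits.

    Higher score = more in-distribution. Computed with the standard
    max-shift trick to avoid overflow in exp().
    """
    z = logits.max(axis=-1)
    return z + np.log(np.exp(logits - z[..., None]).sum(axis=-1))

# Hypothetical logits: a peaked (confident) prediction vs. a flat one.
in_dist_logits = np.array([[9.0, 0.5, 0.2]])
ood_logits = np.array([[1.1, 1.0, 0.9]])

for name, fn in [("MSP", max_softmax_prob), ("Energy", energy_score)]:
    print(f"{name}: in-dist={fn(in_dist_logits)[0]:.3f}, "
          f"OOD={fn(ood_logits)[0]:.3f}")
```

In practice, a threshold on such a score (chosen on held-out in-distribution data, e.g. at 95% true-positive rate) turns the score into an OOD detector; the research surveyed here largely proposes better-separating scores or architectures that make these scores more reliable.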
Papers
OpenOOD v1.5: Enhanced Benchmark for Out-of-Distribution Detection
Jingyang Zhang, Jingkang Yang, Pengyun Wang, Haoqi Wang, Yueqian Lin, Haoran Zhang, Yiyou Sun, Xuefeng Du, Yixuan Li, Ziwei Liu, Yiran Chen, Hai Li
Feed Two Birds with One Scone: Exploiting Wild Data for Both Out-of-Distribution Generalization and Detection
Haoyue Bai, Gregory Canal, Xuefeng Du, Jeongyeol Kwon, Robert Nowak, Yixuan Li
BED: Bi-Encoder-Based Detectors for Out-of-Distribution Detection
Louis Owen, Biddwan Ahmed, Abhay Kumar
Unleashing Mask: Explore the Intrinsic Out-of-Distribution Detection Capability
Jianing Zhu, Hengzhuang Li, Jiangchao Yao, Tongliang Liu, Jianliang Xu, Bo Han
A Functional Data Perspective and Baseline On Multi-Layer Out-of-Distribution Detection
Eduardo Dadalto, Pierre Colombo, Guillaume Staerman, Nathan Noiry, Pablo Piantanida