Out-of-Distribution Detection
Out-of-distribution (OOD) detection aims to identify inputs that differ significantly from a machine learning model's training distribution, a capability that is crucial for reliable and safe model deployment. Current research focuses on developing novel scoring functions and model architectures, including those based on diffusion models, variational autoencoders, and vision-language models, to improve the accuracy and efficiency of OOD detection, often under challenges such as imbalanced datasets and limited access to model parameters. By mitigating the risk of confident predictions on unseen data, this work enhances the trustworthiness of AI systems across diverse applications, from autonomous driving to medical diagnosis. A growing emphasis is placed on methods that are both effective and computationally efficient, particularly for resource-constrained environments.
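To make the idea of a scoring function concrete, here is a minimal sketch of the classic maximum softmax probability (MSP) baseline, which flags an input as OOD when the model's top softmax probability falls below a threshold. This is a standard illustrative baseline, not the method of any paper listed below, and the threshold value here is purely illustrative.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # MSP score: the model's top softmax probability.
    # Higher means "more in-distribution".
    return softmax(logits).max(axis=-1)

def detect_ood(logits, threshold=0.5):
    # Flag inputs whose MSP falls below the (illustrative) threshold as OOD.
    return msp_score(logits) < threshold

# A peaked (confident) prediction vs. near-uniform logits over 10 classes.
confident = np.array([[9.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]])
uncertain = np.zeros((1, 10))  # uniform logits give MSP = 0.1
```

In practice the threshold is calibrated on held-out in-distribution data (for example, to a target false-positive rate), and much of the research summarized above can be read as replacing this simple score with richer ones derived from generative models, feature-space geometry, or vision-language embeddings.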
Papers
Out-of-Distribution Detection and Data Drift Monitoring using Statistical Process Control
Ghada Zamzmi, Kesavan Venkatesh, Brandon Nelson, Smriti Prathapan, Paul H. Yi, Berkman Sahiner, Jana G. Delfino
TriAug: Out-of-Distribution Detection for Imbalanced Breast Lesion in Ultrasound
Yinyu Ye, Shijing Chen, Dong Ni, Ruobing Huang
How Does Unlabeled Data Provably Help Out-of-Distribution Detection?
Xuefeng Du, Zhen Fang, Ilias Diakonikolas, Yixuan Li
Kernel PCA for Out-of-Distribution Detection
Kun Fang, Qinghua Tao, Kexin Lv, Mingzhen He, Xiaolin Huang, Jie Yang
Learning with Mixture of Prototypes for Out-of-Distribution Detection
Haodong Lu, Dong Gong, Shuo Wang, Jason Xue, Lina Yao, Kristen Moore