Out-of-Distribution Detection
Out-of-distribution (OOD) detection aims to identify data points that differ significantly from a machine learning model's training data, which is crucial for ensuring reliable and safe model deployment. Current research focuses on developing novel scoring functions and model architectures, including those based on diffusion models, variational autoencoders, and vision-language models, to improve the accuracy and efficiency of OOD detection, often addressing challenges posed by imbalanced datasets and limited access to model parameters. This field is vital for enhancing the trustworthiness of AI systems across diverse applications, from autonomous driving to medical diagnosis, by mitigating the risks of making predictions on unseen data. A growing emphasis is placed on methods that are both effective and computationally efficient, particularly for resource-constrained environments.
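To make the idea of a scoring function concrete, here is a minimal sketch of two common baselines: maximum softmax probability (MSP) and the energy score, both computed from a classifier's logits. The function names and example logits below are illustrative, not drawn from any of the papers listed here.

```python
import numpy as np

def msp_score(logits):
    """Maximum softmax probability: higher score -> more in-distribution."""
    z = logits - logits.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)

def energy_score(logits, T=1.0):
    """Negative free energy (logsumexp of logits): higher -> more in-distribution."""
    return T * np.log(np.exp(logits / T).sum(axis=-1))

# A confident prediction vs. near-uniform logits; in practice the
# OOD threshold is fit on held-out in-distribution data.
logits = np.array([[5.0, 0.1, 0.2],    # confident -> high score
                   [1.1, 1.0, 0.9]])   # uncertain -> low score
print(msp_score(logits))
print(energy_score(logits))
```

Inputs whose score falls below the chosen threshold are flagged as OOD; more recent methods refine this recipe with feature-space statistics, logit post-processing, or auxiliary models.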
Papers
Logit Scaling for Out-of-Distribution Detection
Andrija Djurisic, Rosanne Liu, Mladen Nikolic
DNN-GDITD: Out-of-distribution detection via Deep Neural Network based Gaussian Descriptor for Imbalanced Tabular Data
Priyanka Chudasama, Anil Surisetty, Aakarsh Malhotra, Alok Singh
Compressing VAE-Based Out-of-Distribution Detectors for Embedded Deployment
Aditya Bansal, Michael Yuhas, Arvind Easwaran
Hierarchical Visual Categories Modeling: A Joint Representation Learning and Density Estimation Framework for Out-of-Distribution Detection
Jinglun Li, Xinyu Zhou, Pinxue Guo, Yixuan Sun, Yiwen Huang, Weifeng Ge, Wenqiang Zhang
TagOOD: A Novel Approach to Out-of-Distribution Detection via Vision-Language Representations and Class Center Learning
Jinglun Li, Xinyu Zhou, Kaixun Jiang, Lingyi Hong, Pinxue Guo, Zhaoyu Chen, Weifeng Ge, Wenqiang Zhang