Few-Shot Out-of-Distribution Detection

Out-of-distribution (OOD) detection focuses on identifying inputs that differ significantly from the data a machine learning model was trained on, a crucial task for reliable real-world deployment. Current research emphasizes few-shot OOD detection, which aims for accurate identification with only a handful of labeled examples, often by leveraging pre-trained vision-language models such as CLIP together with techniques like prompt learning, contrastive fine-tuning, and adaptive pseudo-labeling. Progress here is vital for the robustness and safety of AI systems across domains from image recognition to natural language processing, since it mitigates the risks posed by unexpected or adversarial inputs.
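
To make the CLIP-based approach concrete, below is a minimal sketch of a zero-/few-shot OOD score in the spirit of maximum-softmax methods such as MCM: an image is scored by its best cosine match against text prompts for the in-distribution classes, and low scores flag likely OOD inputs. It assumes the Hugging Face transformers CLIP API; the class names, temperature, and threshold are illustrative placeholders rather than values from any particular paper.

```python
# Minimal sketch: CLIP-based OOD scoring via maximum softmax over
# class-prompt similarities (in the spirit of MCM). Class names,
# temperature, and threshold are illustrative placeholders.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
model.eval()

id_classes = ["dog", "cat", "car"]  # hypothetical in-distribution labels
prompts = [f"a photo of a {c}" for c in id_classes]

@torch.no_grad()
def ood_score(image: Image.Image, temperature: float = 0.01) -> float:
    """Return an in-distribution confidence in [0, 1]; low values
    suggest the image is OOD relative to the prompt set."""
    inputs = processor(text=prompts, images=image,
                       return_tensors="pt", padding=True)
    img = model.get_image_features(pixel_values=inputs["pixel_values"])
    txt = model.get_text_features(input_ids=inputs["input_ids"],
                                  attention_mask=inputs["attention_mask"])
    img = img / img.norm(dim=-1, keepdim=True)  # unit-normalize for cosine
    txt = txt / txt.norm(dim=-1, keepdim=True)
    sims = img @ txt.T                          # (1, num_classes) cosine sims
    probs = (sims / temperature).softmax(dim=-1)
    return probs.max().item()                   # max softmax = ID confidence

# Usage: flag inputs whose score falls below a validation-tuned threshold.
# score = ood_score(Image.open("example.jpg"))
# is_ood = score < 0.5  # placeholder threshold
```

Few-shot variants build on the same scoring scheme but adapt the text side, for example by learning the prompt tokens from the handful of labeled examples instead of using fixed templates.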

Papers