Few-Shot Out-of-Distribution Detection
Few-shot out-of-distribution (OOD) detection focuses on identifying data points that differ significantly from the training distribution of a machine learning model, a crucial task for reliable real-world deployment. Current research aims to achieve accurate OOD identification with only a handful of labeled examples, often leveraging pre-trained vision-language models such as CLIP and incorporating techniques like prompt learning, contrastive fine-tuning, and adaptive pseudo-labeling to improve performance. This line of work enhances the robustness and safety of AI systems across domains from image recognition to natural language processing by mitigating the risks posed by unexpected or adversarial inputs.
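To make the CLIP-based approach concrete, the sketch below shows a common scoring idea used in this literature (in the style of the Maximum Concept Matching score): an image embedding is compared against text embeddings of the in-distribution class names, and the maximum softmax probability over those similarities serves as a confidence score, with low values flagging OOD inputs. This is a minimal illustration using NumPy with made-up embeddings, not a specific paper's implementation; the function name and temperature parameter are assumptions for the example.

```python
import numpy as np

def mcm_ood_score(image_feat, text_feats, temperature=1.0):
    """MCM-style OOD score (illustrative sketch).

    image_feat: (d,) embedding of the test image.
    text_feats: (num_classes, d) embeddings of in-distribution class prompts.
    Returns the max softmax probability over image-text similarities;
    a low value suggests the image is out-of-distribution.
    """
    # L2-normalize so dot products become cosine similarities, as in CLIP.
    image_feat = image_feat / np.linalg.norm(image_feat)
    text_feats = text_feats / np.linalg.norm(text_feats, axis=1, keepdims=True)
    sims = (text_feats @ image_feat) / temperature
    # Numerically stable softmax over class similarities.
    probs = np.exp(sims - sims.max())
    probs /= probs.sum()
    return float(probs.max())

# Toy example: three orthogonal "class" text embeddings.
text_feats = np.eye(3)
id_image = np.array([1.0, 0.0, 0.0])    # aligns with class 0 -> high score
ood_image = np.array([1.0, 1.0, 1.0])   # equally far from all classes -> low score

id_score = mcm_ood_score(id_image, text_feats)
ood_score = mcm_ood_score(ood_image, text_feats)
```

In practice the embeddings would come from a frozen CLIP image and text encoder, and the few-shot methods surveyed here adapt the prompts or a lightweight head on top of this score rather than the backbone itself.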