Out-of-Distribution Detection
Out-of-distribution (OOD) detection aims to identify data points that differ significantly from the training data of a machine learning model, improving the robustness and reliability of predictions in real-world applications. Current research focuses on enhancing existing methods like deep metric learning and autoencoders, exploring novel approaches such as leveraging language models for improved interpretability and employing synthetic data generation to address data scarcity. These advancements are crucial for deploying reliable machine learning systems in safety-critical domains like autonomous driving and cyber-physical systems, where misclassifications can have severe consequences.
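To make the core idea concrete, the sketch below implements the maximum softmax probability (MSP) score, a standard OOD-detection baseline (not one of the specific methods surveyed above): a classifier's most confident class probability tends to be lower on inputs far from the training distribution, so one minus that probability serves as an OOD score. The function name and example logits are illustrative, not from any particular system.

```python
import numpy as np

def msp_ood_score(logits):
    """Maximum Softmax Probability (MSP) baseline OOD score.

    Takes an array of raw classifier logits, shape (n_samples, n_classes),
    and returns one score per sample; higher means more likely OOD.
    """
    # Subtract the per-row max before exponentiating for numerical stability.
    z = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    # Low top-class confidence => high OOD score.
    return 1.0 - probs.max(axis=-1)

# A confidently classified input vs. one with near-uniform logits:
in_dist_logits = np.array([[8.0, 0.5, 0.2]])   # peaked => low OOD score
ood_logits = np.array([[1.1, 1.0, 0.9]])       # flat => high OOD score
print(msp_ood_score(in_dist_logits), msp_ood_score(ood_logits))
```

In practice the score is thresholded (the threshold is chosen on held-out in-distribution data, e.g. to fix the false-positive rate), and many of the methods mentioned above can be viewed as replacing this simple confidence score with a better-calibrated one.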