Class Uncertainty

Class uncertainty, in the context of machine learning, refers to a model's uncertainty about which class a given input belongs to: its quantification, and the mitigation of overconfident predictions, particularly where class boundaries are ambiguous or the training data are imbalanced. Current research focuses on developing metrics beyond simple class cardinality that better capture class uncertainty, incorporating those metrics into existing class-imbalance mitigation techniques, and leveraging uncertainty estimates to improve model training and generalization, often through evidential deep learning or techniques such as Monte Carlo Dropout. Understanding and addressing class uncertainty is crucial for building reliable and robust machine learning systems, particularly in high-stakes applications where well-calibrated prediction confidence is paramount.
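
As a concrete illustration of one of the techniques mentioned above, the sketch below shows how Monte Carlo Dropout can yield a per-example class-uncertainty score: dropout is kept active at prediction time, several stochastic forward passes are averaged, and the entropy of the averaged class probabilities serves as the uncertainty estimate. This is a minimal sketch, not a reproduction of any paper listed below; the architecture, dimensions, and function names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MCDropoutClassifier(nn.Module):
    """Small classifier whose dropout layer is reused at inference time."""
    def __init__(self, in_dim, num_classes, hidden=128, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_predict(model, x, n_samples=50):
    """Average several stochastic forward passes and return the mean class
    probabilities plus the predictive entropy as an uncertainty score."""
    model.train()  # keep dropout active; freeze batch-norm separately if present
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
        )  # shape: (n_samples, batch, num_classes)
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy

# Usage: higher entropy flags inputs that sit near ambiguous class boundaries.
model = MCDropoutClassifier(in_dim=20, num_classes=3)
x = torch.randn(8, 20)
mean_probs, uncertainty = mc_dropout_predict(model, x)
```

In practice the number of forward passes trades compute for the stability of the uncertainty estimate, and the resulting scores can feed into the imbalance-aware training or sample-weighting schemes discussed in the papers below.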

Papers