Softmax Probability
Softmax probability, the standard output representing predicted class probabilities in multi-class classification, is a crucial element of deep learning models but suffers from well-documented overconfidence, particularly on out-of-distribution (OOD) data, where a model can assign high probability to classes it has never seen. Current research focuses on improving the reliability of softmax probabilities through methods such as energy-based models, prototype-based approaches, and novel confidence measures that leverage ensemble diversity or quantile representations to better reflect model uncertainty. These advances are vital for enhancing the trustworthiness and robustness of AI systems in real-world applications, particularly in safety-critical domains where accurate uncertainty quantification is paramount.
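The contrast between softmax confidence and an energy-based score can be sketched as follows. This is a minimal illustration, not a specific published method: the `energy_score` helper follows the general energy-based OOD detection idea (negative temperature-scaled log-sum-exp of the logits), and the example logit values are made-up assumptions.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Subtract the max logit for numerical stability before exponentiating.
    z = logits - np.max(logits, axis=axis, keepdims=True)
    exp_z = np.exp(z)
    return exp_z / np.sum(exp_z, axis=axis, keepdims=True)

def energy_score(logits, temperature=1.0):
    # Energy-style OOD score: -T * logsumexp(logits / T).
    # Lower (more negative) values suggest in-distribution inputs;
    # higher values suggest OOD inputs. The exact form and threshold
    # are assumptions for illustration.
    return -temperature * np.log(np.sum(np.exp(logits / temperature)))

# Hypothetical logits: one confidently in-distribution input,
# one ambiguous, OOD-like input with near-uniform logits.
id_logits = np.array([8.0, 1.0, 0.5])
ood_logits = np.array([2.1, 2.0, 1.9])

print(softmax(id_logits))    # sharply peaked distribution
print(softmax(ood_logits))   # near-uniform probabilities
print(energy_score(id_logits), energy_score(ood_logits))
```

Note that the softmax of the ambiguous input still yields a well-formed probability vector, so its maximum probability alone cannot flag the input as unfamiliar; the energy score, by contrast, separates the two cases because it depends on the overall magnitude of the logits rather than only their relative differences.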