Softmax Loss
Softmax loss is a fundamental component of many deep learning models, used primarily for multi-class classification by converting a model's raw outputs into a probability distribution over classes. Current research focuses on improving its performance and robustness through modifications such as angular or additive margins, adaptive distillation, and alternative loss functions that address noisy data, class imbalance, and the need for efficient model compression. These advances improve accuracy, fairness, and robustness across applications including face recognition, recommendation systems, and visual recognition, influencing both the theoretical understanding and the practical deployment of deep learning models.
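As a concrete illustration of the standard formulation, the sketch below (using only NumPy; the function name is ours) computes softmax cross-entropy for a single example, with the usual max-subtraction trick for numerical stability:

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    """Cross-entropy of the softmax distribution against a single true class."""
    # Subtract the max logit before exponentiating to avoid overflow;
    # this leaves the softmax probabilities unchanged.
    z = logits - np.max(logits)
    log_probs = z - np.log(np.sum(np.exp(z)))  # log-softmax
    return -log_probs[label]  # negative log-likelihood of the true class

logits = np.array([2.0, 1.0, 0.1])
loss = softmax_cross_entropy(logits, label=0)  # ~0.417 for this example
```

Margin-based variants mentioned above (e.g. for face recognition) modify the true-class logit before this computation, which tightens intra-class clustering in the learned embedding space.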