Cross Entropy
Cross-entropy quantifies how well one probability distribution approximates another, and it is a standard loss function in machine learning for training models that predict probabilities. Current research focuses on improving its application in various contexts, including optimizing hyperparameters for deep neural networks, enhancing multilingual speech recognition for low-resource languages, and developing generative models for tabular data. These advances matter because they improve the accuracy, efficiency, and robustness of machine learning models across diverse applications, from robotics and natural language processing to medical image analysis and autonomous driving.
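As an illustration of the definition above, here is a minimal sketch (assuming NumPy; the function name and example values are illustrative, not taken from any cited paper) that computes the cross-entropy H(p, q) = -Σᵢ pᵢ log qᵢ between a target distribution and a model's predicted distribution:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i).

    p: target distribution (e.g., one-hot labels), shape (..., num_classes)
    q: predicted distribution (e.g., softmax outputs), shape (..., num_classes)
    eps: small constant to avoid log(0)
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q), axis=-1)

# Hypothetical 3-class example: true class is index 1
target = np.array([0.0, 1.0, 0.0])
pred = np.array([0.2, 0.7, 0.1])
print(cross_entropy(target, pred))  # -log(0.7) ≈ 0.357
```

Minimizing this quantity over the model's predictions is equivalent to maximizing the likelihood of the target labels, which is why it is the default training criterion for probabilistic classifiers.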
Papers
Uncertainty Quantification for Bird's Eye View Semantic Segmentation: Methods and Benchmarks
Linlin Yu, Bowen Yang, Tianhao Wang, Kangshuo Li, Feng Chen
Aligning Multiclass Neural Network Classifier Criterion with Task Performance via $F_\beta$-Score
Nathan Tsoi, Deyuan Li, Taesoo Daniel Lee, Marynel Vázquez