Cross-Entropy
Cross-entropy is a fundamental measure of the difference between two probability distributions, frequently used as a loss function in machine learning to train models that predict probabilities. Current research focuses on improving its application in various contexts, including optimizing hyperparameters for deep neural networks, enhancing multilingual speech recognition for low-resource languages, and developing novel generative models for tabular data. These advancements are significant because they improve the accuracy, efficiency, and robustness of machine learning models across diverse applications, from robotics and natural language processing to medical image analysis and autonomous driving.
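For reference, the cross-entropy between a target distribution p and a predicted distribution q is H(p, q) = -Σᵢ pᵢ log qᵢ; when p is a one-hot label, this reduces to the negative log-probability the model assigns to the correct class. Below is a minimal NumPy sketch of this definition, given only as an illustration (the function name and example values are hypothetical and not drawn from any of the papers listed here).

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i) between a target
    distribution p and a predicted distribution q (both sum to 1)."""
    q = np.clip(q, eps, 1.0)   # guard against log(0)
    return -np.sum(p * np.log(q))

# One-hot target (true class at index 2) vs. predicted class probabilities.
p = np.array([0.0, 0.0, 1.0, 0.0])
q = np.array([0.1, 0.2, 0.6, 0.1])
print(cross_entropy(p, q))     # ~0.511, i.e. -log(0.6)
```

In classification libraries this same quantity is typically computed from raw logits for numerical stability, but the value being minimized is the one defined above.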
Papers
Dynamic Alignment Mask CTC: Improved Mask-CTC with Aligned Cross Entropy
Xulong Zhang, Haobin Tang, Jianzong Wang, Ning Cheng, Jian Luo, Jing Xiao
On the Implicit Geometry of Cross-Entropy Parameterizations for Label-Imbalanced Data
Tina Behnia, Ganesh Ramachandra Kini, Vala Vakilian, Christos Thrampoulidis