Cross Entropy
Cross-entropy quantifies how well one probability distribution approximates another, and it is the standard loss function for training machine learning models that output probabilities, such as classifiers. Current research focuses on improving its application in various contexts, including optimizing hyperparameters for deep neural networks, enhancing multilingual speech recognition for low-resource languages, and developing generative models for tabular data. These advances matter because they improve the accuracy, efficiency, and robustness of machine learning models across diverse applications, from robotics and natural language processing to medical image analysis and autonomous driving.
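For reference, the quantity itself is H(p, q) = -Σₓ p(x) log q(x), where p is the target distribution and q is the model's predicted distribution; it is minimized when q matches p. The sketch below (plain Python, with a hypothetical `cross_entropy` helper of our own naming) illustrates the one-hot classification case that underlies the usual loss:

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x), in nats.
    eps guards against log(0) when q assigns zero probability."""
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# True distribution vs. a model's predicted probabilities
p = [1.0, 0.0, 0.0]          # one-hot target (true class is 0)
q = [0.7, 0.2, 0.1]          # model's predicted distribution
print(cross_entropy(p, q))   # ~0.357 nats; lower means q is closer to p
```

With a one-hot target, the sum collapses to -log of the probability the model assigns to the true class, which is why cross-entropy loss is often described as negative log-likelihood.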