Cross-Entropy
Cross-entropy is a fundamental measure of the difference between two probability distributions: for a true distribution p and a predicted distribution q, it is defined as H(p, q) = −Σₓ p(x) log q(x), and it is minimized when q matches p. It is widely used as a loss function in machine learning for training models that output probabilities. Current research focuses on improving its application in various contexts, including optimizing hyperparameters for deep neural networks, enhancing multilingual speech recognition for low-resource languages, and developing novel generative models for tabular data. These advances matter because they improve the accuracy, efficiency, and robustness of machine learning models across diverse applications, from robotics and natural language processing to medical image analysis and autonomous driving.
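As a minimal sketch of the definition above (in NumPy; the `cross_entropy` helper and the example numbers are illustrative, not from any specific paper), the loss for a one-hot label reduces to the negative log-probability the model assigns to the correct class:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_x p(x) * log(q(x)), in nats.

    p: true distribution (e.g., a one-hot label vector)
    q: predicted distribution (entries should sum to 1)
    eps: small constant to avoid taking log(0)
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# Example: a 3-class problem where the true class is index 0.
p = np.array([1.0, 0.0, 0.0])   # one-hot label
q = np.array([0.7, 0.2, 0.1])   # model's predicted probabilities
print(cross_entropy(p, q))      # -log(0.7) ≈ 0.357
```

Because p is one-hot, only the log-probability of the true class contributes, which is why minimizing cross-entropy pushes the model to assign high probability to correct labels.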