Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
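The contrast drawn above — standard mean squared error versus bounded or smooth robust alternatives, and asymmetric losses that weight errors unevenly — can be sketched concretely. The snippet below is an illustrative sketch, not taken from any of the listed papers: the Huber loss stands in for the "smooth alternative to MSE" idea, and `asymmetric_squared_loss` (with hypothetical weights `w_pos`, `w_neg`) illustrates penalizing one error direction more than the other.

```python
import numpy as np

def mse_loss(y_pred, y_true):
    """Standard mean squared error: unbounded growth, so a single
    large outlier can dominate the average."""
    return np.mean((y_pred - y_true) ** 2)

def huber_loss(y_pred, y_true, delta=1.0):
    """A smooth, outlier-robust alternative to MSE: quadratic for
    residuals with |r| <= delta, linear beyond, limiting outlier
    influence while staying differentiable everywhere."""
    r = y_pred - y_true
    quadratic = 0.5 * r ** 2
    linear = delta * (np.abs(r) - 0.5 * delta)
    return np.mean(np.where(np.abs(r) <= delta, quadratic, linear))

def asymmetric_squared_loss(y_pred, y_true, w_pos=2.0, w_neg=1.0):
    """Illustrative asymmetric loss: over-predictions (r > 0) are
    weighted w_pos, under-predictions w_neg, so the two error
    directions cost differently -- the core idea behind asymmetric
    losses for imbalanced problems."""
    r = y_pred - y_true
    return np.mean(np.where(r > 0, w_pos, w_neg) * r ** 2)

y_true = np.zeros(5)
y_pred = np.array([0.1, -0.1, 0.2, -0.2, 10.0])  # last point is an outlier
print(mse_loss(y_pred, y_true))    # outlier contributes 100 to the mean
print(huber_loss(y_pred, y_true))  # outlier contributes only ~9.5
```

On this toy data the outlier inflates MSE far more than the Huber loss, which is exactly the robustness property the paragraph describes; the asymmetric variant similarly makes a +1 residual cost twice a -1 residual under the default weights.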
Papers
Understanding the bias-variance tradeoff of Bregman divergences
Ben Adlam, Neha Gupta, Zelda Mariet, Jamie Smith
An Improved Analysis of Gradient Tracking for Decentralized Machine Learning
Anastasia Koloskova, Tao Lin, Sebastian U. Stich
Penalizing Gradient Norm for Efficiently Improving Generalization in Deep Learning
Yang Zhao, Hao Zhang, Xiuyuan Hu
DeepCENT: Prediction of Censored Event Time via Deep Learning
Jong-Hyeon Jeong, Yichen Jia