Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
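To make the two design ideas above concrete, here is a minimal sketch (not taken from any of the listed papers) of a smooth, outlier-robust alternative to mean squared error (a pseudo-Huber loss) and an asymmetric focal-style loss for imbalanced binary classification. The function names and the delta/gamma hyperparameters are illustrative choices, not the formulations used in the papers below.

```python
import torch


def pseudo_huber_loss(pred: torch.Tensor, target: torch.Tensor,
                      delta: float = 1.0) -> torch.Tensor:
    """Smooth robust regression loss: quadratic near zero, roughly linear
    for large residuals, so outliers contribute less than under MSE."""
    residual = pred - target
    return torch.mean(delta ** 2 * (torch.sqrt(1 + (residual / delta) ** 2) - 1))


def asymmetric_focal_loss(logits: torch.Tensor, target: torch.Tensor,
                          gamma_pos: float = 0.0, gamma_neg: float = 2.0) -> torch.Tensor:
    """Binary cross-entropy with separate focusing exponents for positive and
    negative samples, down-weighting easy negatives to counter class imbalance."""
    p = torch.sigmoid(logits)
    loss_pos = target * (1 - p) ** gamma_pos * torch.log(p.clamp(min=1e-8))
    loss_neg = (1 - target) * p ** gamma_neg * torch.log((1 - p).clamp(min=1e-8))
    return -torch.mean(loss_pos + loss_neg)


if __name__ == "__main__":
    pred = torch.tensor([0.1, 0.5, 5.0])   # last prediction acts as an outlier
    target = torch.tensor([0.0, 0.4, 0.2])
    print("pseudo-Huber:", pseudo_huber_loss(pred, target).item())

    logits = torch.tensor([2.0, -1.5, 0.3])
    labels = torch.tensor([1.0, 0.0, 0.0])
    print("asymmetric focal:", asymmetric_focal_loss(logits, labels).item())
```

Setting gamma_neg above zero shrinks the contribution of confidently classified negatives, which is one common way asymmetric losses shift the training signal toward the rare positive class.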
Papers
Convergence and Implicit Regularization Properties of Gradient Descent for Deep Residual Networks
Rama Cont, Alain Rossier, Renyuan Xu
Constructing Open Cloze Tests Using Generation and Discrimination Capabilities of Transformers
Mariano Felice, Shiva Taslimipoor, Paula Buttery
Surface Similarity Parameter: A New Machine Learning Loss Metric for Oscillatory Spatio-Temporal Data
Mathies Wedler, Merten Stender, Marco Klein, Svenja Ehlers, Norbert Hoffmann
Analysis of Different Losses for Deep Learning Image Colorization
Coloma Ballester, Aurélie Bugeau, Hernan Carrillo, Michaël Clément, Rémi Giraud, Lara Raad, Patricia Vitoria
PAGP: A physics-assisted Gaussian process framework with active learning for forward and inverse problems of partial differential equations
Jiahao Zhang, Shiqi Zhang, Guang Lin