Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
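To make the robustness point concrete, here is a minimal sketch (not drawn from any of the papers below) comparing standard mean squared error with two common robust alternatives: the Huber loss, which is smooth and grows only linearly in the tails, and the Welsch loss, which is bounded so that a single outlier can contribute at most a constant:

```python
import numpy as np

def mse_loss(residual):
    """Standard squared error: unbounded, so large outliers dominate."""
    return residual ** 2

def huber_loss(residual, delta=1.0):
    """Smooth alternative: quadratic near zero, linear beyond delta."""
    r = np.abs(residual)
    return np.where(r <= delta, 0.5 * r ** 2, delta * (r - 0.5 * delta))

def welsch_loss(residual, c=1.0):
    """Bounded alternative: saturates at c**2 / 2 for large residuals."""
    return (c ** 2 / 2) * (1 - np.exp(-(residual / c) ** 2))

residuals = np.array([0.1, 1.0, 10.0])
print(mse_loss(residuals))     # the outlier at 10 contributes 100
print(huber_loss(residuals))   # the outlier contributes 9.5 (linear growth)
print(welsch_loss(residuals))  # the outlier contributes at most 0.5 (bounded)
```

The trade-off is that bounded losses like Welsch are non-convex, which can complicate optimization; the choice between smooth-unbounded and bounded variants depends on how heavy-tailed the noise is expected to be.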
Papers
Scaling Session-Based Transformer Recommendations using Optimized Negative Sampling and Loss Functions
Timo Wilm, Philipp Normann, Sophie Baumeister, Paul-Vincent Kobow
The Effect of Spoken Language on Speech Enhancement using Self-Supervised Speech Representation Loss Functions
George Close, Thomas Hain, Stefan Goetze
Energy Discrepancies: A Score-Independent Loss for Energy-Based Models
Tobias Schröder, Zijing Ou, Jen Ning Lim, Yingzhen Li, Sebastian J. Vollmer, Andrew B. Duncan
Learning Stochastic Dynamical Systems as an Implicit Regularization with Graph Neural Networks
Jin Guo, Ting Gao, Yufu Lan, Peng Zhang, Sikun Yang, Jinqiao Duan
Mini-Batch Optimization of Contrastive Loss
Jaewoong Cho, Kartik Sreenivasan, Keon Lee, Kyunghoo Mun, Soheun Yi, Jeong-Gwan Lee, Anna Lee, Jy-yong Sohn, Dimitris Papailiopoulos, Kangwook Lee