Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
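To make the two design themes above concrete, here is a minimal, self-contained Python sketch (not taken from any of the listed papers) of a bounded, smooth alternative to mean squared error (the Welsch loss) and an asymmetric binary cross-entropy for class imbalance; the hyperparameters c, gamma_pos, and gamma_neg are illustrative choices, not values from the cited work.

```python
import numpy as np

def mse(y_true, y_pred):
    """Standard mean squared error: unbounded, so a single outlier can dominate."""
    return np.mean((y_true - y_pred) ** 2)

def welsch_loss(y_true, y_pred, c=1.0):
    """Welsch loss: a bounded, smooth alternative to MSE.
    Each residual contributes at most c**2 / 2, capping the influence of outliers."""
    r = y_true - y_pred
    return np.mean((c ** 2 / 2.0) * (1.0 - np.exp(-(r / c) ** 2)))

def asymmetric_focal_bce(y_true, p_pred, gamma_pos=0.0, gamma_neg=2.0, eps=1e-7):
    """Asymmetric binary cross-entropy: down-weights easy negatives more strongly
    than positives (gamma_neg > gamma_pos), a common recipe for class imbalance."""
    p = np.clip(p_pred, eps, 1.0 - eps)
    loss_pos = -y_true * (1.0 - p) ** gamma_pos * np.log(p)
    loss_neg = -(1.0 - y_true) * p ** gamma_neg * np.log(1.0 - p)
    return np.mean(loss_pos + loss_neg)

# Toy comparison: one large outlier inflates MSE far more than the bounded loss.
y_true = np.array([0.0, 0.1, -0.2, 8.0])   # last target is an outlier
y_pred = np.zeros(4)
print(mse(y_true, y_pred))          # dominated by the outlier
print(welsch_loss(y_true, y_pred))  # outlier contribution is capped at c**2 / 2
```

Running the toy comparison shows the MSE value driven almost entirely by the single outlier, while the bounded loss stays small, which is the behavior robust regression losses are designed to achieve.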
Papers
Robust Loss Functions for Training Decision Trees with Noisy Labels
Jonathan Wilton, Nan Ye
BSL: Understanding and Improving Softmax Loss for Recommendation
Junkang Wu, Jiawei Chen, Jiancan Wu, Wentao Shi, Jizhi Zhang, Xiang Wang
Single-channel speech enhancement using learnable loss mixup
Oscar Chang, Dung N. Tran, Kazuhito Koishida
Loss Functions in the Era of Semantic Segmentation: A Survey and Outlook
Reza Azad, Moein Heidari, Kadir Yilmaz, Michael Hüttemann, Sanaz Karimijafarbigloo, Yuli Wu, Anke Schmeink, Dorit Merhof
TaskMet: Task-Driven Metric Learning for Model Learning
Dishank Bansal, Ricky T. Q. Chen, Mustafa Mukadam, Brandon Amos