Loss Function
Loss functions are central components of machine learning models: they guide learning by quantifying the gap between predicted and actual values. Current research emphasizes loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and distributions over their hyperparameters) and robustness to noise and outliers (using bounded, smooth alternatives to standard functions such as mean squared error). These advances improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly affects the performance and reliability of machine learning models across scientific and engineering domains.
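For concreteness, below is a minimal NumPy sketch of two standard losses of the kinds mentioned above: the Huber loss, a smooth, outlier-robust alternative to squared error, and the focal loss, a re-weighted cross-entropy commonly used for class imbalance. These are illustrative textbook choices, not the specific losses proposed in the papers listed here (e.g., the HawkEye loss), and the function names and toy data are hypothetical.

```python
import numpy as np

def huber_loss(y_true, y_pred, delta=1.0):
    """Smooth, robust alternative to squared error: quadratic for small
    residuals, linear for large ones, so outliers contribute less."""
    residual = y_true - y_pred
    small = np.abs(residual) <= delta
    return np.where(small,
                    0.5 * residual ** 2,
                    delta * (np.abs(residual) - 0.5 * delta))

def focal_loss(y_true, p_pred, gamma=2.0, alpha=0.25, eps=1e-7):
    """Down-weights easy, well-classified examples so training focuses on
    hard (often minority-class) examples; alpha re-balances the classes."""
    p_pred = np.clip(p_pred, eps, 1.0 - eps)
    p_t = np.where(y_true == 1, p_pred, 1.0 - p_pred)
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

# Toy regression example: the last target is an outlier. The Huber terms
# grow only linearly on it, while the squared-error term grows quadratically.
y_true = np.array([1.0, 2.0, 3.0, 10.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.0])
print("0.5 * squared-error terms:", 0.5 * (y_true - y_pred) ** 2)
print("Huber terms:              ", huber_loss(y_true, y_pred))
```

On this toy data the outlier dominates the squared-error objective but contributes only a linearly growing term to the Huber objective, which is the basic mechanism behind the robust losses discussed above.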
Papers
Generalization Bounds and Model Complexity for Kolmogorov-Arnold Networks
Xianyang Zhang, Huijuan Zhou
Metamizer: a versatile neural optimizer for fast and accurate physics simulations
Nils Wandel, Stefan Schulz, Reinhard Klein
Adaptive Real-Time Multi-Loss Function Optimization Using Dynamic Memory Fusion Framework: A Case Study on Breast Cancer Segmentation
Amin Golnari, Mostafa Diba
Advancing RVFL networks: Robust classification with the HawkEye loss function
Mushir Akhtar, Ritik Mishra, M. Tanveer, Mohd. Arshad
Posterior-Mean Rectified Flow: Towards Minimum MSE Photo-Realistic Image Restoration
Guy Ohayon, Tomer Michaeli, Michael Elad
A Taxonomy of Loss Functions for Stochastic Optimal Control
Carles Domingo-Enrich
Is All Learning (Natural) Gradient Descent?
Lucas Shoji, Kenta Suzuki, Leo Kozachkov
An Explicit Consistency-Preserving Loss Function for Phase Reconstruction and Speech Enhancement
Pin-Jui Ku, Chun-Wei Ho, Hao Yen, Sabato Marco Siniscalchi, Chin-Hui Lee
TE-PINN: Quaternion-Based Orientation Estimation using Transformer-Enhanced Physics-Informed Neural Networks
Arman Asgharpoor Golroudbari