Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
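To make the robustness point above concrete, here is a minimal sketch comparing mean squared error with the Welsch loss, one standard bounded, smooth alternative; the function names, the scale parameter `c`, and the toy data are illustrative assumptions, not taken from any of the papers listed below.

```python
import numpy as np

def mse_loss(y_true, y_pred):
    # Standard squared error: a single outlier dominates the mean
    # because the per-point penalty grows quadratically.
    return np.mean((y_true - y_pred) ** 2)

def welsch_loss(y_true, y_pred, c=1.0):
    # Welsch (Leclerc) loss: smooth everywhere and bounded above by
    # c**2 / 2 per point, so outliers contribute at most a constant.
    r = y_true - y_pred
    return np.mean((c ** 2 / 2) * (1 - np.exp(-(r / c) ** 2)))

# Toy regression residuals; the last target is a gross outlier.
y_true = np.array([1.0, 2.0, 3.0, 100.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.0])

print(mse_loss(y_true, y_pred))     # dominated by the outlier
print(welsch_loss(y_true, y_pred))  # outlier's contribution is capped
```

Because each point's Welsch penalty saturates at `c**2 / 2`, the gradient from the outlier vanishes rather than overwhelming the update, which is the mechanism behind the robustness gains the paragraph describes.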
Papers
I Know Therefore I Score: Label-Free Crafting of Scoring Functions using Constraints Based on Domain Expertise
Ragja Palakkadavath, Sarath Sivaprasad, Shirish Karande, Niranjan Pedanekar
Semi-Supervised Learning with Mutual Distillation for Monocular Depth Estimation
Jongbeom Baek, Gyeongnyeon Kim, Seungryong Kim
Renyi Fair Information Bottleneck for Image Classification
Adam Gronowski, William Paul, Fady Alajaji, Bahman Gharesifard, Philippe Burlina
Probabilistic Rotation Representation With an Efficiently Computable Bingham Loss Function and Its Application to Pose Estimation
Hiroya Sato, Takuya Ikeda, Koichi Nishiwaki
Cutting Some Slack for SGD with Adaptive Polyak Stepsizes
Robert M. Gower, Mathieu Blondel, Nidham Gazagnadou, Fabian Pedregosa
Physics-Informed Neural Networks for Quantum Eigenvalue Problems
Henry Jin, Marios Mattheakis, Pavlos Protopapas
Loss as the Inconsistency of a Probabilistic Dependency Graph: Choose Your Model, Not Your Loss Function
Oliver E Richardson