Loss Function
Loss functions are crucial components of machine learning models, guiding the learning process by quantifying the difference between predicted and actual values. Current research emphasizes developing loss functions tailored to specific challenges, such as class imbalance in classification (addressed through asymmetric losses and hyperparameter distributions) and robustness to noise and outliers (using bounded and smooth alternatives to standard functions like mean squared error). These advancements improve model accuracy, efficiency, and generalizability across diverse applications, including medical image analysis, time series prediction, and physics-informed neural networks. The ongoing exploration of loss function design directly impacts the performance and reliability of machine learning models in various scientific and engineering domains.
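As a concrete illustration of the two design themes above, the sketch below (plain NumPy, with function names chosen here for illustration) compares standard mean squared error with a bounded, smooth Welsch-style robust loss and an asymmetric focal-style binary loss that down-weights easy negatives. It is a minimal example of these general ideas under stated assumptions, not the formulation used by any particular paper listed below.

```python
import numpy as np

def mse_loss(y_pred, y_true):
    # Standard mean squared error: unbounded, so a single large residual dominates the average.
    return np.mean((y_pred - y_true) ** 2)

def welsch_loss(y_pred, y_true, c=1.0):
    # Bounded, smooth robust alternative (Welsch/Leclerc form):
    # each residual contributes at most c**2 / 2, limiting the influence of outliers.
    r = y_pred - y_true
    return np.mean((c ** 2 / 2.0) * (1.0 - np.exp(-(r / c) ** 2)))

def asymmetric_focal_loss(p, y, gamma_pos=0.0, gamma_neg=2.0, eps=1e-7):
    # Asymmetric focal-style binary loss: with gamma_neg > gamma_pos,
    # confident (easy) negatives are down-weighted more aggressively than positives,
    # which helps when negatives vastly outnumber positives.
    p = np.clip(p, eps, 1.0 - eps)
    pos_term = y * (1.0 - p) ** gamma_pos * np.log(p)
    neg_term = (1.0 - y) * p ** gamma_neg * np.log(1.0 - p)
    return -np.mean(pos_term + neg_term)

# Toy regression targets with one gross outlier: MSE blows up, the bounded loss does not.
y_true = np.array([0.0, 0.1, -0.2, 10.0])
y_pred = np.array([0.1, 0.0, -0.1, 0.0])
print("MSE:", mse_loss(y_pred, y_true), "Welsch:", welsch_loss(y_pred, y_true))

# Imbalanced binary labels (mostly negatives) with predicted probabilities.
labels = np.array([0, 0, 0, 0, 0, 0, 0, 1], dtype=float)
probs = np.array([0.10, 0.20, 0.05, 0.10, 0.30, 0.15, 0.20, 0.60])
print("Asymmetric focal:", asymmetric_focal_loss(probs, labels))
```

In the regression example, the outlier at 10.0 pushes the MSE to roughly 25 while the Welsch loss saturates near 0.13 per sample; in the classification example, raising gamma_neg shrinks the contribution of the many easy negatives relative to the single positive.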
Papers
On the uncertainty principle of neural networks
Jun-Jie Zhang, Dong-Xiao Zhang, Jian-Nan Chen, Long-Gang Pang, Deyu Meng
Revisiting Communication-Efficient Federated Learning with Balanced Global and Local Updates
Zhigang Yan, Dong Li, Zhichao Zhang, Jiguang He
Data Determines Distributional Robustness in Contrastive Language Image Pre-training (CLIP)
Alex Fang, Gabriel Ilharco, Mitchell Wortsman, Yuhao Wan, Vaishaal Shankar, Achal Dave, Ludwig Schmidt
AutoLossGen: Automatic Loss Function Generation for Recommender Systems
Zelong Li, Jianchao Ji, Yingqiang Ge, Yongfeng Zhang
Extremal GloVe: Theoretically Accurate Distributed Word Embedding by Tail Inference
Hao Wang
Supervised Contrastive CSI Representation Learning for Massive MIMO Positioning
Junquan Deng, Wei Shi, Jianzhao Zhang, Xianyu Zhang, Chuan Zhang