Empirical Risk

Empirical risk minimization (ERM) fits a model by minimizing the average loss over a training dataset, the core objective underlying most supervised machine learning. Current research focuses on improving ERM's robustness and generalization, exploring techniques such as tilted empirical risk (which exponentially reweights per-example losses), variance minimization in model aggregation, and regularization via divergences such as f-divergences and Rényi divergences. These advances address challenges including overfitting, noisy labels, and the need for reliable uncertainty quantification, supporting the development of more accurate and trustworthy machine learning models across diverse applications. Research also investigates ERM's theoretical properties under various conditions, including heavy-tailed data distributions and the effects of data augmentation.
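To make the two objectives concrete, below is a minimal numerical sketch of the standard ERM average and the tilted empirical risk, (1/t)·log((1/n)·Σ exp(t·ℓᵢ)), which recovers the plain average as t → 0, emphasizes high-loss examples for t > 0, and de-emphasizes them for t < 0 (one reason it is studied for noisy labels). The function names and the toy loss values are illustrative assumptions, not any paper's reference implementation.

```python
import numpy as np

def empirical_risk(losses):
    """Standard ERM objective: the average per-example loss."""
    return float(np.mean(losses))

def tilted_empirical_risk(losses, t):
    """Tilted ERM objective: (1/t) * log(mean(exp(t * loss))).

    Illustrative sketch; computed via a log-sum-exp shift
    so large t * loss values do not overflow.
    """
    losses = np.asarray(losses, dtype=float)
    m = np.max(t * losses)  # stability shift
    return float((m + np.log(np.mean(np.exp(t * losses - m)))) / t)

# Toy example: two easy examples and one hard (or mislabeled) one.
losses = np.array([0.1, 0.2, 5.0])
print(empirical_risk(losses))                 # ~1.77, plain average
print(tilted_empirical_risk(losses, t=2.0))   # ~4.45, pulled toward the worst loss
print(tilted_empirical_risk(losses, t=-2.0))  # ~0.35, outlier largely discounted
```

The sign of the tilt parameter t thus trades off between a worst-case flavor (robust to distribution shift on hard examples) and an outlier-suppressing flavor (robust to label noise), with standard ERM as the t → 0 limit.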

Papers