Empirical Risk Minimization
Empirical Risk Minimization (ERM) is a fundamental machine learning principle: choose the model that minimizes the average prediction error (the empirical risk) on the training data. Current research focuses on improving ERM's robustness and generalization through techniques such as regularization (including relative entropy and f-divergences), distributionally robust optimization, and model aggregation strategies that prioritize variance reduction over raw error minimization. These advances address challenges such as overfitting, adversarial attacks, and fairness concerns, leading to more reliable and trustworthy machine learning models with broader applicability across diverse fields.
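As an illustration of the principle (a minimal sketch not drawn from any of the listed papers), the snippet below fits a linear model by gradient descent on the empirical risk, here taken to be the mean squared error over the training sample:

```python
import numpy as np

def empirical_risk(w, X, y):
    """Mean squared error of the linear predictor X @ w on the sample."""
    residuals = X @ w - y
    return np.mean(residuals ** 2)

def erm_fit(X, y, lr=0.1, steps=500):
    """Minimize the empirical risk by plain gradient descent."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        # Gradient of the mean squared error with respect to w
        grad = 2.0 / n * X.T @ (X @ w - y)
        w -= lr * grad
    return w

# Toy data: y = 3*x plus small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=100)

w = erm_fit(X, y)
print(w[0])  # close to 3.0, the true coefficient
```

Adding a penalty term (e.g. a relative-entropy or L2 regularizer) to `empirical_risk` yields the regularized variants discussed above, which trade a small increase in training error for better generalization.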
Papers
Listing of 19 papers dated February 27, 2024 through August 3, 2024 (titles not captured in this extract).