Convex Objective
Convex objective functions are central to many optimization problems in machine learning and related fields, where the primary goal is to find their minimum efficiently. Current research focuses on developing and analyzing algorithms, such as Frank-Wolfe and variants of stochastic gradient descent (SGD), that remain efficient under constraints (e.g., projection-free methods) or in distributed settings like federated learning. These advances are crucial for improving the scalability and robustness of machine learning models, particularly in applications with large datasets, privacy concerns, or unreliable communication environments.
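As a concrete illustration of the projection-free idea mentioned above (not tied to any specific paper listed here), the sketch below runs a minimal Frank-Wolfe loop to minimize a smooth convex objective over the probability simplex. The function names, the least-squares objective, and the step-size schedule are illustrative assumptions, not taken from the source.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Projection-free Frank-Wolfe over the probability simplex.

    grad: callable returning the gradient of a convex objective at x.
    x0:   feasible starting point (non-negative entries summing to 1).
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        # Linear minimization oracle: over the simplex the minimizer of
        # <s, g> is a vertex, so no projection step is ever required.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2.0)      # standard diminishing step size
        x = x + gamma * (s - x)      # convex combination keeps x feasible
    return x

# Hypothetical example: least-squares objective f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = frank_wolfe_simplex(grad_f, np.ones(5) / 5)
print(x_star, x_star.sum())  # iterate stays non-negative and sums to 1
```

Because each iterate is a convex combination of feasible points, the method avoids the projection step that constrained SGD variants would need, which is the main appeal of projection-free approaches in large-scale settings.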
Papers
19 papers, dated December 3, 2021 through November 13, 2024.