Convex Objective

Convex objective functions are central to many optimization problems in machine learning and related fields, where the goal is to find their minimum efficiently; convexity guarantees that any local minimum is also a global one, which makes this search tractable. Current research focuses on developing and analyzing algorithms, such as Frank-Wolfe and variants of stochastic gradient descent (SGD), that remain efficient under constraints (e.g., projection-free methods) or in distributed settings such as federated learning. These advances are crucial for improving the scalability and robustness of machine learning models, particularly in applications with large datasets, privacy concerns, or unreliable communication environments.
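To make the projection-free idea concrete, here is a minimal sketch of the Frank-Wolfe method on a convex quadratic objective over the probability simplex. Instead of projecting onto the constraint set, each step solves a linear minimization oracle (here, just an argmin over gradient coordinates) and moves toward the resulting vertex. The matrix `A`, vector `b`, iteration count, and step-size schedule are all illustrative assumptions, not taken from any specific paper.

```python
import numpy as np

# Frank-Wolfe sketch: minimize f(x) = 0.5 * ||A x - b||^2
# over the probability simplex {x : x >= 0, sum(x) = 1}.
# Projection-free: feasibility is preserved by taking convex
# combinations of the iterate and a simplex vertex.

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))   # illustrative problem data
b = rng.standard_normal(20)

def grad(x):
    return A.T @ (A @ x - b)       # gradient of the quadratic

x = np.full(5, 1.0 / 5)           # start at the simplex centre
for t in range(200):
    g = grad(x)
    s = np.zeros(5)
    s[np.argmin(g)] = 1.0         # LMO: best vertex of the simplex
    gamma = 2.0 / (t + 2)         # standard diminishing step size
    x = (1 - gamma) * x + gamma * s  # convex combination stays feasible

print(x)  # approximate constrained minimizer; sums to 1
```

Because each iterate is a convex combination of feasible points, no projection step is ever needed, which is exactly what makes this family of methods attractive when projections onto the constraint set are expensive.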

Papers