Conditional Gradient
Conditional gradient methods, also known as Frank-Wolfe algorithms, are projection-free optimization techniques used to solve constrained problems, particularly those arising in machine learning and high-dimensional data analysis. Current research focuses on improving efficiency through variance reduction, stochastic gradient estimation, and the development of accelerated variants like the conditional gradient sliding algorithm, often applied within frameworks such as federated learning and optimal transport. These advancements enhance the applicability of conditional gradient methods to large-scale problems, impacting areas like deep learning training, inverse problems, and robust optimization in settings with noisy or limited data.
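The core of any conditional gradient method is replacing the projection step with a linear minimization oracle (LMO) over the constraint set. The following is a minimal sketch of the classic Frank-Wolfe iteration, assuming an illustrative least-squares objective and an ℓ1-ball constraint (the function names, step-size schedule choice, and problem data here are examples, not from the papers summarized above):

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, n_iters=100):
    # Classic Frank-Wolfe (conditional gradient): at each step, minimize
    # the linearized objective over the feasible set via the LMO instead
    # of projecting, then move toward that vertex with a decaying step.
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        s = lmo(g)                # argmin over C of <g, s>
        gamma = 2.0 / (t + 2.0)   # standard O(1/t) step-size schedule
        x = (1 - gamma) * x + gamma * s
    return x

def l1_ball_lmo(g, tau=1.0):
    # LMO over the l1-ball of radius tau: a signed, scaled basis vector
    # at the coordinate where the gradient has largest magnitude.
    i = np.argmax(np.abs(g))
    s = np.zeros_like(g)
    s[i] = -tau * np.sign(g[i])
    return s

# Hypothetical example: minimize ||Ax - b||^2 subject to ||x||_1 <= 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.5, -0.3, 0.0, 0.0, 0.2])
grad = lambda x: 2 * A.T @ (A @ x - b)
x_star = frank_wolfe(grad, l1_ball_lmo, np.zeros(5), n_iters=500)
```

Because each iterate is a convex combination of LMO outputs, every iterate is feasible by construction, which is why the method needs no projection.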