Gradient Complexity
Gradient complexity measures the computational cost of an optimization algorithm as the number of gradient evaluations required to reach a target solution accuracy, and research in this area aims to minimize that count. Current work develops algorithms with improved gradient complexity in a variety of settings, including distributed and decentralized optimization, continual learning, and non-convex problems, often employing variance reduction techniques and primal-dual methods. These advances matter for the efficiency and scalability of machine learning models, particularly in large-scale applications where computational resources are limited. The ultimate goal is to design algorithms whose gradient complexity is optimal or near-optimal, i.e., matches or approaches the known lower bounds for each problem class and setting.
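To make the notion concrete, the sketch below implements a basic variance-reduced method (SVRG-style) for a finite-sum least-squares problem and explicitly counts component-gradient evaluations, which is exactly the quantity gradient complexity tracks. For an L-smooth, strongly convex finite sum of n terms, variance reduction of this kind is known to need on the order of (n + L/mu) * log(1/epsilon) component-gradient evaluations, compared with roughly n * (L/mu) * log(1/epsilon) for full gradient descent. The problem instance, step size, and epoch length here are illustrative choices, not parameters from any specific paper.

```python
# Minimal SVRG-style sketch on least squares, with a gradient-evaluation counter.
# Assumptions: synthetic data, hand-picked step size and inner-loop length.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 20
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient of f(x) = (1/n) * sum_i f_i(x)."""
    return A.T @ (A @ x - b) / n

def svrg(epochs=30, inner=2 * n, step=2e-3):
    x_ref = np.zeros(d)          # snapshot point
    grad_evals = 0               # gradient-complexity counter
    for _ in range(epochs):
        mu = full_grad(x_ref)    # one full pass = n component gradients
        grad_evals += n
        x = x_ref.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, and its variance
            # shrinks as x and the snapshot approach the optimum.
            g = grad_i(x, i) - grad_i(x_ref, i) + mu
            grad_evals += 2      # two component-gradient evaluations per step
            x -= step * g
        x_ref = x                # update the snapshot
    return x_ref, grad_evals

x_hat, evals = svrg()
print(f"gradient evaluations: {evals}, "
      f"gradient norm at solution: {np.linalg.norm(full_grad(x_hat)):.2e}")
```

The counter makes the trade-off visible: each epoch pays n evaluations for the full-gradient snapshot plus two per inner step, and improving gradient complexity amounts to reaching a given gradient norm with a smaller total count.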