Extra Gradient

Extra-gradient methods are optimization algorithms for solving variational inequalities, saddle-point problems, and machine learning training objectives, with a focus on improving convergence speed and efficiency, particularly in distributed or constrained settings. Their defining feature is an extrapolation ("look-ahead") step: the gradient operator is first evaluated at a trial point, and the actual update then uses the gradient taken at that trial point, which stabilizes problems such as min-max games where plain gradient steps cycle or diverge. Current research adapts extra-gradient and related gradient-based methods to specific applications, such as federated learning (using ADMM), long-tailed classification (balancing gradients), and out-of-distribution detection (leveraging gradient spaces), often incorporating techniques like gradient boosting or stochastic averaging. These advances improve the performance and scalability of machine learning algorithms across diverse domains, from computer vision and recommendation systems to the numerical solution of partial differential equations.
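The two-step structure described above can be sketched in a few lines. The example below is a minimal illustration, not drawn from any of the papers listed here: it applies the extra-gradient update to the bilinear saddle-point problem min_x max_y xy, whose rotational vector field makes plain simultaneous gradient descent-ascent spiral away from the solution, while the look-ahead step pulls the iterates toward the saddle point at the origin. The function names and step size are illustrative choices.

```python
import numpy as np

def F(z):
    """Monotone operator of the min-max game min_x max_y x*y.

    Descent direction in x is the gradient y; ascent in y flips the sign,
    giving the rotational field F(x, y) = (y, -x).
    """
    x, y = z
    return np.array([y, -x])

def extra_gradient(z0, eta=0.1, steps=2000):
    """Extra-gradient iteration: extrapolate, then update with the
    gradient evaluated at the extrapolated (look-ahead) point."""
    z = np.array(z0, dtype=float)
    for _ in range(steps):
        z_half = z - eta * F(z)   # extrapolation step
        z = z - eta * F(z_half)   # update using the look-ahead gradient
    return z

# Converges toward the saddle point (0, 0); a plain gradient
# descent-ascent step z - eta * F(z) would instead grow in norm.
z_star = extra_gradient([1.0, 1.0])
```

For this linear field, one plain step multiplies the iterate norm by sqrt(1 + eta^2) > 1, while the extra-gradient step multiplies it by sqrt((1 - eta^2)^2 + eta^2) < 1, which is why the look-ahead version converges.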

Papers