Coordinate Descent

Coordinate descent is an optimization method that iteratively updates a single variable (or a small block of variables) at a time, holding the others fixed, to minimize an objective function. Current research focuses on extending it to challenging problems, including Bayesian network learning, neural network training, and solving extensive-form games, often incorporating techniques such as line search, gradient information, and adaptive sampling strategies to improve efficiency and convergence. Because each update touches only a small part of the problem, the approach scales well and is computationally efficient on large problems across diverse fields, from machine learning and optimization to robotics and federated learning. Developing robust and theoretically sound coordinate descent algorithms remains an active area, yielding improved performance and broader applicability in numerous domains.
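As a concrete illustration of the basic idea (a minimal sketch, not taken from any particular paper listed below), cyclic coordinate descent for the least-squares objective ||Ax - b||^2 can be written in a few lines: each step exactly minimizes over one coordinate while the others stay fixed, and a running residual makes each update cheap.

```python
def coordinate_descent_lstsq(A, b, n_passes=100):
    """Cyclic coordinate descent for min_x ||Ax - b||^2.

    A is a list of rows, b a list of targets. Each inner step
    exactly minimizes the objective over coordinate j while
    holding the other coordinates fixed.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    r = list(b)  # residual r = b - Ax; starts at b since x = 0
    for _ in range(n_passes):
        for j in range(n):
            col_sq = sum(A[i][j] ** 2 for i in range(m))
            if col_sq == 0.0:
                continue  # column is all zeros; coordinate has no effect
            # Exact 1-D minimizer along coordinate j: step = a_j . r / ||a_j||^2
            step = sum(A[i][j] * r[i] for i in range(m)) / col_sq
            x[j] += step
            for i in range(m):  # update residual incrementally
                r[i] -= step * A[i][j]
    return x
```

For example, `coordinate_descent_lstsq([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]], [1.0, 2.0, 3.0])` converges to the least-squares solution `[1.0, 2.0]`. The incremental residual update is the standard trick that keeps each coordinate step at O(m) cost instead of recomputing Ax from scratch.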

Papers