Riemannian Gradient

Riemannian gradient methods address optimization problems whose solution space is a curved manifold rather than a flat Euclidean space, a setting that arises frequently in machine learning and related fields. Current research focuses on efficient algorithms, including Riemannian gradient descent and its accelerated variants, coordinate descent methods tailored to specific manifolds (e.g., Stiefel, Grassmann), and Riemannian natural gradient methods that exploit second-order information. The common template is to compute the Euclidean gradient, project it onto the tangent space at the current iterate, take a step in that tangent direction, and map the result back onto the manifold with a retraction. These advances improve the efficiency and convergence guarantees of manifold optimization in applications ranging from tensor completion and quantum process tomography to robust rotation synchronization and neural network training, with impact across science and engineering.
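
As a minimal sketch of that template, the Python/NumPy snippet below runs Riemannian gradient descent on the unit sphere, a simple embedded manifold: the Euclidean gradient is projected onto the tangent space at the current point, and a retraction (here, renormalization) maps the update back onto the sphere. The toy objective (a Rayleigh quotient), step size, and function names are illustrative assumptions, not taken from any particular paper in the list below.

```python
import numpy as np

def riemannian_gradient_descent(A, x0, step=0.01, iters=1000):
    """Minimize f(x) = x^T A x over the unit sphere ||x|| = 1.

    Illustrative sketch: tangent-space projection followed by a
    retraction via renormalization. For symmetric A this converges
    to an eigenvector for the smallest eigenvalue (a classic toy problem).
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2.0 * A @ x                # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x    # project onto tangent space at x
        x = x - step * rgrad               # step in the tangent direction
        x = x / np.linalg.norm(x)          # retraction back onto the sphere
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T                            # symmetric PSD test matrix
    x = riemannian_gradient_descent(A, rng.standard_normal(5))
    print("f(x) =", x @ A @ x)             # approx. smallest eigenvalue of A
```

The same projection-plus-retraction pattern generalizes to other manifolds (e.g., QR-based retractions on the Stiefel manifold); only the tangent-space projection and retraction change, while the outer descent loop stays the same.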

Papers