Convergence Property

Convergence properties in machine learning and optimization are a central focus of current research, which aims to understand and improve how quickly and reliably algorithms reach optimal solutions. Active areas include analyzing the convergence of stochastic gradient descent (SGD) variants such as random reshuffling, and exploring novel algorithms such as fractional gradient descent and methods operating on Riemannian manifolds, often within federated learning frameworks. These investigations are crucial for developing more efficient and robust machine learning models and optimization techniques across diverse applications, from image reconstruction to distributionally robust optimization.
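
To make one of these directions concrete, the sketch below implements SGD with random reshuffling for a generic finite-sum objective: instead of sampling components with replacement, each epoch draws a fresh permutation of the n component indices and sweeps through every component exactly once, which is the setting whose convergence the cited line of work analyzes. The least-squares instance, function names, and step size are illustrative assumptions, not taken from any particular paper listed here.

```python
import numpy as np

def sgd_random_reshuffling(grad_i, x0, n, lr=0.01, epochs=50, seed=0):
    """Minimize a finite-sum objective f(x) = (1/n) * sum_i f_i(x).

    Unlike with-replacement SGD, random reshuffling draws a new
    permutation of the n component indices at the start of each
    epoch and visits every component exactly once per epoch.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(epochs):
        for i in rng.permutation(n):  # one full pass, no replacement
            x -= lr * grad_i(x, i)
    return x

# Hypothetical least-squares example: f_i(x) = 0.5 * (a_i @ x - b_i)^2,
# so grad f_i(x) = (a_i @ x - b_i) * a_i.
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 5))
x_true = rng.normal(size=5)
b = A @ x_true
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]

x_hat = sgd_random_reshuffling(grad_i, np.zeros(5), n=100, lr=0.05, epochs=200)
print(np.linalg.norm(x_hat - x_true))  # should be close to zero
```

The only change from plain SGD is the sampling scheme (a per-epoch permutation rather than independent draws), yet this is exactly the difference that the convergence analyses in this area study.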

Papers