First-Order Algorithms

First-order algorithms are iterative optimization methods that use only gradient (first-derivative) information, which makes them computationally efficient for large-scale problems. Current research focuses on improving their convergence rates and extending them to broader problem classes, including non-convex and min-max problems, often by combining techniques such as variance reduction, adaptive step sizes, and entropy regularization with stochastic gradient descent and its variants. These advances are crucial for scaling machine learning models and for solving complex problems in fields like wildfire science and optimal transport, where efficient optimization is paramount.
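To make the "gradient information only" idea concrete, here is a minimal sketch of the simplest first-order method, gradient descent, applied to an illustrative quadratic objective. The problem data (`A`, `b`), step size, and iteration count are assumptions for demonstration, not drawn from any specific paper; stochastic variants such as SGD follow the same update but replace the full gradient with a cheap, noisy estimate.

```python
import numpy as np

def gradient_descent(grad, x0, step_size=0.1, n_iters=500):
    """First-order update loop: each step uses only the gradient at
    the current iterate, never second-derivative (Hessian) information."""
    x = x0
    for _ in range(n_iters):
        x = x - step_size * grad(x)  # x_{k+1} = x_k - eta * grad f(x_k)
    return x

# Illustrative strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer solves A x = b.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b

x_star = gradient_descent(grad, np.zeros(2))
print(x_star)  # converges toward the minimizer [1/3, 2]
```

With the step size below 2 over the largest eigenvalue of `A`, the iterates contract geometrically toward the minimizer; adaptive-step-size and variance-reduced methods refine exactly this update rule.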

Papers