First-Order Methods

First-order methods are optimization algorithms that use only gradient information to iteratively find solutions to complex problems, offering a lower per-iteration cost than higher-order methods. Current research focuses on extending their applicability to challenging problem settings, including bilevel optimization, minimax problems, and online learning scenarios, often employing techniques such as accelerated gradient descent and adaptive step sizes to improve convergence rates. These advances matter because they enable efficient solutions to large-scale problems in diverse fields such as machine learning, robotics, and resource allocation, where computational cost is a major constraint.
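
To make the idea concrete, below is a minimal sketch comparing plain gradient descent with Nesterov's accelerated gradient on a simple quadratic objective. The test problem, step size, and iteration count are illustrative assumptions, not drawn from any particular paper listed on this page.

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x, with gradient A x - b.
def gradient(A, b, x):
    """Gradient of the quadratic objective at x."""
    return A @ x - b

def gradient_descent(A, b, x0, step, iters):
    """Vanilla first-order method: x_{k+1} = x_k - step * grad f(x_k)."""
    x = x0.copy()
    for _ in range(iters):
        x -= step * gradient(A, b, x)
    return x

def nesterov(A, b, x0, step, iters):
    """Accelerated variant: the gradient step is taken at an extrapolated point."""
    x, x_prev = x0.copy(), x0.copy()
    for k in range(1, iters + 1):
        momentum = (k - 1) / (k + 2)          # standard Nesterov momentum schedule
        y = x + momentum * (x - x_prev)       # look-ahead (extrapolated) point
        x_prev = x
        x = y - step * gradient(A, b, y)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.diag(np.linspace(1.0, 50.0, 20))   # ill-conditioned diagonal quadratic
    b = rng.standard_normal(20)
    x0 = np.zeros(20)
    step = 1.0 / 50.0                         # 1 / L, with L the largest eigenvalue
    x_star = np.linalg.solve(A, b)
    for name, solver in [("gradient descent", gradient_descent), ("Nesterov", nesterov)]:
        x = solver(A, b, x0, step, 100)
        print(f"{name}: distance to optimum = {np.linalg.norm(x - x_star):.2e}")
```

Both solvers query only the gradient (no Hessian), which is what keeps the per-iteration cost low; the accelerated variant typically reaches a given accuracy in far fewer iterations on ill-conditioned problems like this one.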

Papers