Newton-CG
Newton-Conjugate Gradient (Newton-CG) methods are being actively developed to efficiently find approximate stationary points (both first- and second-order) for challenging nonconvex optimization problems, including those with equality and conic constraints. Research focuses on improving the efficiency and complexity guarantees of Newton-CG algorithms, particularly within augmented Lagrangian and barrier methods, often leveraging techniques to handle Hölder continuous Hessians. These advances have significant potential to improve the scalability and solution quality of optimization algorithms in scientific and engineering applications where nonconvexity is prevalent.
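As a concrete, minimal sketch of the basic technique (not the specific algorithms from the research described above), SciPy's built-in `Newton-CG` solver approximately solves each Newton system with conjugate gradient iterations. Here it is applied to the Rosenbrock function, a standard nonconvex test problem standing in for the harder objectives discussed above:

```python
# Minimal sketch: Newton-CG on a nonconvex test problem via SciPy.
# The 'Newton-CG' method solves the Newton step equations inexactly
# with conjugate gradients, using Hessian-vector information.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

x0 = np.array([-1.2, 1.0])  # standard Rosenbrock starting point
res = minimize(rosen, x0, method="Newton-CG",
               jac=rosen_der, hess=rosen_hess,
               options={"xtol": 1e-8})

print(res.x)  # approximate first-order stationary point (near [1, 1])
```

For large-scale problems, `hess` can be replaced by a Hessian-vector product callback (`hessp`), which is what makes Newton-CG attractive when forming the full Hessian is too expensive.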