Smooth Convex Optimization

Smooth convex optimization studies the efficient minimization of functions that are both smooth (having a Lipschitz-continuous gradient) and convex (lying above all of their tangent lines, so every local minimum is global). Current research emphasizes developing and analyzing algorithms such as stochastic gradient descent, accelerated coordinate descent, and particle gradient descent, often incorporating techniques like variance reduction and preconditioning to improve convergence rates. These advances are crucial for large-scale problems in machine learning, robotics (e.g., motion planning), and other fields where efficient optimization of smooth convex functions underpins effective model training and control. The development of novel frameworks for constructing "safe" regions guaranteed to contain optimal solutions is another significant area of progress.
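To make the convergence rates concrete: on an L-smooth convex function, plain gradient descent with step size 1/L drives the suboptimality f(x_k) − f* down at rate O(1/k), while Nesterov's accelerated method improves this to O(1/k²). The minimal sketch below illustrates both methods on a toy quadratic; the objective, function names, and parameters are illustrative assumptions, not drawn from any particular paper.

```python
import numpy as np

def gradient_descent(grad, x0, L, iters=1000):
    """Plain gradient descent with the classical step size 1/L.

    For an L-smooth convex f, f(x_k) - f* decays at rate O(1/k).
    """
    x = x0.copy()
    for _ in range(iters):
        x -= grad(x) / L
    return x

def nesterov_accelerated(grad, x0, L, iters=1000):
    """Nesterov's accelerated gradient method, which attains the
    optimal O(1/k^2) rate on L-smooth convex functions."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        # Momentum step: extrapolate along the direction of progress.
        y = x_next + ((t - 1) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Toy smooth convex objective: f(x) = 0.5 x^T A x - b^T x,
# with A symmetric positive semidefinite, so grad f(x) = A x - b.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 10))
A = M.T @ M                       # PSD, hence f is convex
b = rng.standard_normal(10)
L = np.linalg.eigvalsh(A)[-1]     # smoothness constant = largest eigenvalue

grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)    # closed-form minimizer for comparison
x_gd = gradient_descent(grad, np.zeros(10), L)
x_agd = nesterov_accelerated(grad, np.zeros(10), L)
print(np.linalg.norm(x_gd - x_star), np.linalg.norm(x_agd - x_star))
```

Because the step size is tied to the smoothness constant L, no line search is needed; in practice the accelerated variant typically reaches the same accuracy in far fewer iterations than plain gradient descent.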

Papers