Convex Function
Convex functions, for which every local minimum is a global minimum, are central to optimization problems across numerous scientific fields. Current research emphasizes efficient algorithms for complex problems involving convex functions, including those arising in distributed settings, non-smooth scenarios, and those with constraints. This work includes advances in gradient descent methods, fractional programming techniques, and the use of convex duality to reformulate non-convex problems. The resulting improvements in computational efficiency and robustness have significant implications for applications ranging from machine learning and signal processing to robotics and power systems optimization.
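As a minimal sketch of the gradient descent methods mentioned above: on a smooth convex function, repeatedly stepping against the gradient converges to the global minimizer. The quadratic objective and step size below are illustrative assumptions, not drawn from any of the listed papers.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iterate x <- x - lr * grad(x) for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Hypothetical example: f(x) = (x - 3)^2 is convex with unique
# minimizer x* = 3, and its gradient is f'(x) = 2(x - 3).
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_star, 4))  # converges toward 3.0
```

Because the objective is strongly convex with a Lipschitz continuous gradient, a small enough constant step size guarantees linear convergence; for non-smooth convex problems, subgradient or proximal variants are used instead.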
Papers
Convergence rate of the (1+1)-evolution strategy on locally strongly convex functions with Lipschitz continuous gradient and their monotonic transformations
Daiki Morinaga, Kazuto Fukuchi, Jun Sakuma, Youhei Akimoto
Quantum Speedups of Optimizing Approximately Convex Functions with Applications to Logarithmic Regret Stochastic Convex Bandits
Tongyang Li, Ruizhe Zhang