Lower-Level Convexity Simplification

Lower-level convexity simplification refers to modifying or approximating a non-convex optimization problem, most often the inner (lower-level) problem of a bi-level formulation, so that algorithms can exploit the efficiency and theoretical guarantees that convexity provides. Current research explores techniques such as Moreau envelopes, adaptive multi-gradient methods, and convexification via heat evolution, applied in settings including neural networks, probabilistic inference, and bi-level optimization. These advances aim to improve the scalability and reliability of algorithms for machine learning tasks and other applications where non-convexity poses significant challenges. The resulting gains in computational efficiency and theoretical understanding have broad implications across many scientific fields.
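As a rough illustration of one of the techniques named above (not drawn from any specific paper in this collection), the Moreau envelope replaces a possibly non-convex, non-smooth function f with the infimal convolution M_λ f(x) = min_y [ f(y) + ||y − x||² / (2λ) ], which is a smoother surrogate that preserves the set of minimizers. The Python sketch below evaluates this envelope numerically for a simple double-well function; the function, grid, and parameter λ are chosen purely for illustration.

```python
import numpy as np

def moreau_envelope(f, x, lam=0.5, y_grid=None):
    """Numerically approximate the Moreau envelope
    M_lam f(x) = min_y [ f(y) + (y - x)^2 / (2*lam) ]
    by minimizing over a dense grid of candidate points y.
    """
    if y_grid is None:
        y_grid = np.linspace(x - 5.0, x + 5.0, 2001)
    vals = f(y_grid) + (y_grid - x) ** 2 / (2.0 * lam)
    return vals.min()

# Non-convex test function: a double-well potential (illustrative only).
f = lambda y: (y ** 2 - 1.0) ** 2

# Compare the original function with its smoother Moreau envelope.
for x in np.linspace(-2.0, 2.0, 9):
    print(f"x = {x:+.2f}   f(x) = {f(x):7.3f}   "
          f"M_lam f(x) = {moreau_envelope(f, x):7.3f}")
```

Smaller values of λ keep the envelope closer to f, while larger values smooth it more aggressively; in bi-level settings this kind of surrogate is what allows the lower-level problem to be treated with convex or smooth optimization machinery.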

Papers