Multilevel Optimization

Multilevel optimization tackles complex problems by decomposing them into hierarchical levels, enabling efficient solutions to nested or interdependent optimization tasks. Current research focuses on developing provably convergent algorithms, particularly in federated learning settings and for applications such as hyperparameter optimization and neural architecture search, often employing techniques like gossip-based distributed optimization and $\mu$-cuts for non-convex functions. These advances improve the scalability and efficiency of machine learning models, with impact in areas ranging from medical image registration to fair classification, and they help enable the training of larger, more accurate models.
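
As a minimal sketch of the nested structure, consider the two-level (bilevel) special case of a multilevel program; the symbols $F$, $G$, $x$, and $y$ are generic placeholders introduced here for illustration, not taken from any specific paper:

$$
\min_{x}\; F\bigl(x,\, y^{*}(x)\bigr)
\quad \text{s.t.} \quad
y^{*}(x) \in \operatorname*{arg\,min}_{y}\; G(x, y).
$$

In hyperparameter optimization, for example, $x$ would be the hyperparameters, $G$ the training loss minimized over the model weights $y$, and $F$ the validation loss evaluated at the resulting weights $y^{*}(x)$; deeper hierarchies simply nest further optimization problems inside the lower level.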

Papers