Optimality Gap

The optimality gap quantifies the difference between the objective value of the solution returned by an algorithm and that of the true optimal solution to an optimization problem. Current research focuses on bounding and reducing this gap in a variety of settings, including decentralized systems (using probabilistic communication and sequential greedy algorithms), federated learning (employing normalized gradients to handle Byzantine attacks and non-IID data), and machine-learning-augmented optimization (leveraging deep neural networks and convex relaxations). Controlling the optimality gap is crucial for improving the efficiency and reliability of algorithms across diverse applications, from resource allocation in communication networks to complex combinatorial optimization problems.
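
For concreteness, the relative optimality gap of a heuristic solution x̂ to a maximization problem with optimum x* is (f(x*) − f(x̂)) / f(x*). The sketch below is a minimal, self-contained illustration (the knapsack instance, item data, and function names are illustrative assumptions, not drawn from the surveyed papers): it compares a greedy heuristic against a brute-force optimum on a tiny 0/1 knapsack and reports this gap.

```python
# Illustrative sketch (hypothetical instance): measure the relative
# optimality gap of a greedy heuristic on a small 0/1 knapsack problem.
from itertools import combinations

values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50

def greedy_value(values, weights, capacity):
    """Greedy by value/weight ratio; a heuristic, not guaranteed optimal."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    total_value, total_weight = 0, 0
    for i in order:
        if total_weight + weights[i] <= capacity:
            total_weight += weights[i]
            total_value += values[i]
    return total_value

def optimal_value(values, weights, capacity):
    """Brute-force enumeration of all subsets; exact but exponential time."""
    best = 0
    for r in range(len(values) + 1):
        for subset in combinations(range(len(values)), r):
            if sum(weights[i] for i in subset) <= capacity:
                best = max(best, sum(values[i] for i in subset))
    return best

heuristic = greedy_value(values, weights, capacity)
optimum = optimal_value(values, weights, capacity)

# Relative optimality gap for maximization: (f(x*) - f(x_hat)) / f(x*)
gap = (optimum - heuristic) / optimum
print(f"greedy = {heuristic}, optimal = {optimum}, relative gap = {gap:.2%}")
```

On this instance the greedy heuristic attains 160 while the optimum is 220, a relative gap of about 27%; research in this area aims to design algorithms whose gap is provably small or shrinks over time.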

Papers