Communication Complexity

Communication complexity studies how much data must be exchanged to carry out a distributed computation, with the goal of minimizing this cost without sacrificing accuracy. Current research focuses on reducing communication in machine learning settings such as federated learning and decentralized optimization, typically through local updates, gradient compression, and variance reduction, combined with algorithms such as gradient tracking and primal-dual methods. These advances are crucial for scaling machine learning to massive datasets and resource-constrained environments, and they impact distributed training, multi-agent systems, and privacy-preserving computation.
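
To make the compression idea concrete, here is a minimal, self-contained sketch of one widely used technique, top-k gradient sparsification with error feedback, written in plain NumPy. It is an illustration of the general approach rather than the method of any particular paper; the function names (`topk_compress`, `worker_step`), the sizes, and the cost accounting are illustrative assumptions.

```python
import numpy as np

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Returns the indices and values that would actually be transmitted;
    the receiver treats every other coordinate as zero.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def topk_decompress(idx, vals, dim):
    """Rebuild a dense (but sparse-valued) gradient from transmitted pairs."""
    out = np.zeros(dim)
    out[idx] = vals
    return out

def worker_step(grad, residual, k):
    """Error feedback: add back what previous rounds discarded, then compress."""
    corrected = grad + residual
    idx, vals = topk_compress(corrected, k)
    sent = topk_decompress(idx, vals, corrected.size)
    new_residual = corrected - sent   # remembered locally, never transmitted
    return (idx, vals), new_residual

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, k, num_workers = 1000, 50, 4          # each worker sends 5% of coordinates
    residuals = [np.zeros(dim) for _ in range(num_workers)]

    grads = [rng.normal(size=dim) for _ in range(num_workers)]
    messages = []
    for w in range(num_workers):
        msg, residuals[w] = worker_step(grads[w], residuals[w], k)
        messages.append(msg)

    # The server averages the sparse updates it received.
    avg = sum(topk_decompress(i, v, dim) for i, v in messages) / num_workers
    full_cost = dim            # values per worker per round without compression
    compressed_cost = 2 * k    # one index plus one value per kept coordinate
    print(f"per-worker payload: {compressed_cost} vs {full_cost} values")
```

In this toy setup each worker transmits roughly 10% of the uncompressed payload per round, while the locally stored residual keeps the discarded mass from being lost permanently; the same pattern underlies many of the compression-based methods surveyed below.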

Papers