Communication Complexity
Communication complexity studies the amount of communication needed for distributed computation, aiming to minimize this cost while preserving accuracy. Current research focuses on reducing communication in machine learning settings such as federated learning and decentralized optimization, often combining local updates, gradient compression, and variance reduction with algorithms such as gradient tracking and primal-dual methods. These advances are crucial for scaling machine learning to massive datasets and resource-constrained environments, with impact on distributed training, multi-agent systems, and privacy-preserving computation.
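As a concrete illustration of one of the compression techniques mentioned above, the following is a minimal Python/NumPy sketch of top-k gradient sparsification with error feedback, a common way to cut per-round communication from O(d) to O(k). The dimensions, round count, and synthetic gradients are illustrative assumptions, not drawn from any specific paper.

import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries; zero out the rest.
    Transmitting the surviving (index, value) pairs costs O(k)
    instead of O(d) for the dense gradient."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)

# Error feedback: the worker accumulates whatever compression dropped
# and re-injects it before the next round, which is what lets these
# schemes retain convergence guarantees despite the lossy messages.
rng = np.random.default_rng(0)
d, k, rounds = 1000, 50, 5     # illustrative sizes, not from the source
residual = np.zeros(d)
for t in range(rounds):
    grad = rng.normal(size=d)          # stand-in for a local gradient
    corrected = grad + residual        # add back previously dropped mass
    msg = top_k_sparsify(corrected, k) # what is actually transmitted
    residual = corrected - msg         # remember what was dropped
    print(f"round {t}: sent {k}/{d} entries, "
          f"residual norm {np.linalg.norm(residual):.3f}")

The error-feedback residual is the key design choice: naive top-k discards gradient mass permanently, whereas carrying the residual forward makes the compression error average out across rounds.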