Communication Complexity
Communication complexity studies how much information distributed parties must exchange to jointly compute a function or train a model, with the goal of minimizing this cost without sacrificing accuracy. Current research focuses on reducing communication in machine learning settings such as federated learning and decentralized optimization, typically by combining local updates, message compression, and variance reduction within algorithms such as gradient tracking and primal-dual methods. These advances are crucial for scaling machine learning to massive datasets and resource-constrained environments, with impact on distributed training, multi-agent systems, and privacy-preserving computation.
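To make the two most common ingredients concrete, below is a minimal, illustrative sketch of a federated round that combines local updates with top-k compression. It is not taken from any particular paper: the quadratic least-squares objective, the client data, and the helper names `top_k_sparsify` and `local_sgd_round` are all assumptions chosen for a self-contained demo.

```python
import numpy as np

def top_k_sparsify(v, k):
    """Keep only the k largest-magnitude entries of v (a standard
    communication-compression heuristic); all other entries are zeroed,
    so only k values need to be transmitted."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def local_sgd_round(w, X, y, lr=0.1, local_steps=5):
    """Run several local gradient steps on one client's least-squares
    loss before communicating (the 'local updates' idea)."""
    w = w.copy()
    for _ in range(local_steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Hypothetical synthetic setup: each client holds its own least-squares data.
rng = np.random.default_rng(0)
d, n_clients, n_samples = 20, 4, 50
w_true = rng.normal(size=d)
data = []
for _ in range(n_clients):
    X = rng.normal(size=(n_samples, d))
    data.append((X, X @ w_true + 0.01 * rng.normal(size=n_samples)))

w = np.zeros(d)
for _ in range(30):
    # Each client uploads only a compressed model delta (k of d coordinates);
    # the server averages the deltas and broadcasts the new model.
    deltas = [top_k_sparsify(local_sgd_round(w, X, y) - w, k=d // 4)
              for X, y in data]
    w += np.mean(deltas, axis=0)

print("distance to optimum:", np.linalg.norm(w - w_true))
```

In practice, top-k compression is usually paired with an error-feedback mechanism that accumulates the discarded coordinates locally and re-injects them in later rounds, which restores convergence guarantees that naive sparsification can break; this sketch omits that for brevity.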