Communication Efficient
Communication-efficient methods in machine learning aim to reduce the communication and computational overhead of training and deploying large models, particularly in distributed or federated settings. Current research focuses on efficient algorithms such as variants of stochastic gradient descent (SGD) combined with compression techniques (e.g., sparsification, quantization), low-rank model parameterizations, and federated learning strategies that minimize data exchange. These advances are crucial for deploying complex models on resource-constrained devices and for scaling machine learning to massive datasets while preserving privacy and reducing training time.
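To make the compression idea concrete, below is a minimal sketch of top-k gradient sparsification, one of the compression techniques mentioned above: each worker sends only the k largest-magnitude gradient entries instead of the full dense gradient. The function names (`topk_sparsify`, `decompress`) and the toy 1% compression ratio are illustrative assumptions, not taken from any specific paper or library.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient array.

    Returns (indices, values): the compressed message a worker would send
    in place of the full dense gradient.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of the k largest |g_i|
    return idx, flat[idx]

def decompress(indices, values, shape):
    """Rebuild a dense (mostly zero) gradient from the sparse message."""
    dense = np.zeros(int(np.prod(shape)))
    dense[indices] = values
    return dense.reshape(shape)

# Toy usage: compress a simulated gradient to 1% of its entries.
rng = np.random.default_rng(0)
grad = rng.normal(size=(1000,))
idx, vals = topk_sparsify(grad, k=10)
restored = decompress(idx, vals, grad.shape)
print(f"sent {idx.size} of {grad.size} entries, "
      f"kept {np.abs(restored).sum() / np.abs(grad).sum():.1%} of gradient mass")
```

In practice such sparsification is usually paired with error feedback (accumulating the dropped residual locally and adding it to the next gradient) so that convergence is not degraded by the aggressive compression.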