Communication-Efficient
Communication-efficient methods in machine learning aim to reduce the communication (and associated computational) overhead of training and deploying large models, particularly in distributed or federated settings. Current research focuses on efficient algorithms such as variants of stochastic gradient descent (SGD) combined with compression techniques (e.g., gradient sparsification and quantization), low-rank model updates, and federated learning strategies that minimize data exchange between clients and the server. These advances are crucial for deploying complex models on resource-constrained devices and for scaling machine learning to massive datasets while preserving privacy and reducing training time.
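To make the compression idea concrete, the following is a minimal sketch of top-k gradient sparsification, one of the compression techniques mentioned above: each worker transmits only the k largest-magnitude gradient entries instead of the full dense gradient. The function names and the NumPy-based setup are illustrative assumptions, not taken from any specific paper or library.

```python
# A minimal sketch of top-k gradient sparsification for communication-
# efficient distributed SGD. Names here are illustrative, not from a
# specific library.
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int):
    """Keep the k largest-magnitude entries of a flattened gradient.

    Returns the indices and values a worker would transmit instead of
    the full dense gradient.
    """
    flat = grad.ravel()
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def topk_desparsify(idx, vals, shape):
    """Reconstruct a dense (mostly zero) gradient from the sparse message."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

# Example: compress a 1000-element gradient to its 10 largest entries,
# shrinking the payload by roughly 100x (ignoring index overhead).
grad = np.random.randn(1000)
idx, vals = topk_sparsify(grad, k=10)
recovered = topk_desparsify(idx, vals, grad.shape)
```

In practice, methods built on this idea typically accumulate the entries that were dropped into an error-feedback buffer on each worker so that the information is not lost, and quantization can be applied on top of the transmitted values to compress them further.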