New ProxSkip Method

ProxSkip is an algorithm designed to accelerate federated learning (FL) by reducing communication overhead, a major bottleneck in distributed training. It targets composite problems of the form min_x f(x) + ψ(x): cheap gradient steps on f are taken at every iteration, while the expensive proximal operator of ψ (which in FL corresponds to a communication round that averages the clients' models) is evaluated only with a small probability p, with a control variate preventing the skipped steps from drifting. Current research focuses on extending ProxSkip's guarantees to non-convex optimization problems and on achieving linear speedup as the number of participating nodes grows, incorporating techniques such as variance reduction and importance sampling to further improve efficiency. This line of work is significant because it offers a theoretically grounded approach to improving the communication efficiency of FL, potentially enabling the training of larger models on more resource-constrained devices and accelerating progress across FL applications.
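To make the skipping mechanism concrete, below is a minimal single-machine sketch of the ProxSkip iteration from Mishchenko et al. (2022): a gradient step corrected by a control variate h, with the proximal step (the stand-in for a communication round) taken only with probability p. The lasso-style demo problem, the helper names (`proxskip`, `soft_threshold`), and the 1/L stepsize are illustrative assumptions, not part of any reference implementation.

```python
import numpy as np

def soft_threshold(v, tau):
    # Prox of tau * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proxskip(grad_f, prox_psi, x0, gamma, p, T, rng=None):
    """Sketch of ProxSkip for min_x f(x) + psi(x).

    grad_f:   gradient of the smooth part f
    prox_psi: prox_psi(v, tau) = argmin_u psi(u) + ||u - v||^2 / (2*tau)
    gamma:    stepsize; p: probability of taking the (expensive) prox step
    """
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    h = np.zeros_like(x0)              # control variate
    for _ in range(T):
        x_hat = x - gamma * (grad_f(x) - h)
        if rng.random() < p:           # rare prox step (a communication round in FL)
            x = prox_psi(x_hat - (gamma / p) * h, gamma / p)
        else:                          # skip the prox: cheap local step
            x = x_hat
        h = h + (p / gamma) * (x - x_hat)  # update control variate
    return x

# Illustrative demo: f(x) = 0.5*||Ax - b||^2, psi(x) = lam*||x||_1.
rng = np.random.default_rng(42)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_psi = lambda v, tau: soft_threshold(v, tau * lam)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L with L the spectral norm squared
x_hat = proxskip(grad_f, prox_psi, np.zeros(50), gamma, p=0.1, T=5000, rng=rng)
```

With p = 0.1, roughly nine out of ten iterations avoid the prox entirely; in the federated setting the same coin flip decides whether clients communicate in a given round.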

Papers