Paper ID: 2312.04166

Improving Communication Efficiency of Federated Distillation via Accumulating Local Updates

Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Tian Wen, Wen Wang

As an emerging federated learning paradigm, federated distillation enables communication-efficient model training by transmitting only small-scale knowledge during the learning process. To further improve the communication efficiency of federated distillation, we propose ALU (Accumulating Local Updates), a novel technique that accumulates multiple rounds of local updates before transferring the resulting knowledge to the central server. ALU drastically reduces the communication frequency in federated distillation and thereby significantly lowers the communication overhead of training. Empirical results demonstrate that ALU substantially improves the communication efficiency of federated distillation.

Submitted: Dec 7, 2023
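
Below is a minimal sketch of the accumulation idea described in the abstract. It is not the authors' implementation: the abstract does not specify how "knowledge" is represented or accumulated, so this sketch assumes knowledge is a client's soft predictions on a shared proxy set, accumulation is a running average over a hypothetical number of local rounds `tau`, and the server aggregates uploads by simple averaging; names such as `local_train`, `proxy_inputs`, and `tau` are illustrative.

```python
# Hypothetical sketch of Accumulating Local Updates (ALU) for federated distillation.
# Assumptions (not stated in the abstract): knowledge = soft predictions on a shared
# proxy set; accumulation = average over `tau` local rounds; server = plain averaging.
import numpy as np


def client_round(model, local_data, proxy_inputs, local_train, tau=5):
    """Run `tau` local rounds, accumulate knowledge locally, upload once."""
    accumulated = None
    for _ in range(tau):
        model = local_train(model, local_data)       # one local training round
        knowledge = model.predict(proxy_inputs)      # soft predictions on the proxy set
        accumulated = knowledge if accumulated is None else accumulated + knowledge
    return accumulated / tau                         # single upload for tau rounds


def server_aggregate(client_uploads):
    """Average the accumulated knowledge from all clients into global soft labels."""
    return np.mean(np.stack(client_uploads), axis=0)
```

Under these assumptions, a client that would otherwise upload knowledge after every local round now uploads once per `tau` rounds, cutting the number of client-to-server transmissions by a factor of `tau`, which is the communication-frequency reduction the abstract attributes to ALU.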