Paper ID: 2203.13517

Sparse Federated Learning with Hierarchical Personalized Models

Xiaofeng Liu, Qing Wang, Yunfeng Shao, Yinchuan Li

Federated learning (FL) enables reliable collaborative training without collecting users' private data, and its strong privacy-preservation potential has driven a wide range of applications in the Internet of Things (IoT), wireless networks, mobile devices, autonomous vehicles, and cloud-based healthcare. However, standard FL suffers from poor model performance on non-i.i.d. data and excessive communication traffic. To address these issues, we propose a personalized FL algorithm with a hierarchical proximal mapping based on the Moreau envelope, named sparse federated learning with hierarchical personalized models (sFedHP), which significantly improves global model performance under heterogeneous data. A continuously differentiable approximation of the L1-norm is also used as a sparsity constraint to reduce the communication cost. Convergence analysis shows that sFedHP achieves a state-of-the-art convergence rate with linear speedup, and that the sparsity constraint reduces the convergence rate only slightly while substantially lowering the communication cost. Experimentally, we demonstrate the benefits of sFedHP compared with FedAvg, HierFAVG (hierarchical FedAvg), and personalized FL methods based on local customization, including FedAMP, FedProx, Per-FedAvg, pFedMe, and pFedGP.
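For intuition, the following is a minimal sketch of a Moreau-envelope-style personalized objective with a smoothed L1 sparsity penalty, combining the ingredients named in the abstract. The notation (f_i for client i's local loss, theta_i for its personalized model, w for the global model, lambda and alpha for regularization weights, epsilon for the smoothing parameter) is assumed for illustration and is not necessarily the paper's exact hierarchical formulation.

```latex
% Illustrative sketch only (assumed notation, not verbatim from the paper):
% each client minimizes a local loss plus a smoothed ell_1 penalty, kept
% close to the global model w via a Moreau-envelope proximal term.
\begin{align*}
  \min_{w}\; & \frac{1}{N}\sum_{i=1}^{N} F_i(w),
  \qquad
  F_i(w) \;=\; \min_{\theta_i}\;
  \Big\{ f_i(\theta_i)
         \;+\; \alpha\,\|\theta_i\|_{1,\epsilon}
         \;+\; \tfrac{\lambda}{2}\,\|\theta_i - w\|_2^2 \Big\}, \\[4pt]
  & \text{where } \|x\|_{1,\epsilon} \;=\; \sum_{j} \sqrt{x_j^2 + \epsilon}
  \;\text{ is a continuously differentiable approximation of } \|x\|_1 .
\end{align*}
```

In this sketch, the proximal term ties each personalized model to the shared global model, while the smoothed L1 penalty encourages sparse personalized models and hence lower communication cost.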

Submitted: Mar 25, 2022