Decentralized Federated Learning
Decentralized Federated Learning (DFL) is a collaborative machine learning framework that trains models across distributed devices without a central server, removing a single point of failure and keeping raw data local. Current research addresses challenges such as data heterogeneity and Byzantine attacks through novel aggregation algorithms, optimized communication strategies (including model caching and adaptive sampling), and the exploration of alternative network topologies. This serverless approach improves scalability and resilience over centralized federated learning, with significant implications for applications that require privacy-preserving distributed training, such as healthcare and IoT.
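To make the serverless aggregation pattern concrete, below is a minimal NumPy simulation of decentralized SGD with neighborhood (gossip) averaging on a ring topology, one common DFL scheme. The node count, mixing weights, learning rate, and synthetic least-squares task are illustrative assumptions, not details drawn from the papers listed here.

import numpy as np

# Minimal sketch of decentralized federated learning: each node trains on
# private data and averages models only with its ring neighbors (no server).
# All hyperparameters below are illustrative assumptions.
rng = np.random.default_rng(0)
n_nodes, dim, n_local = 8, 5, 100

# Ground-truth model; each node holds a private, heterogeneous local dataset.
w_true = rng.normal(size=dim)
data = []
for i in range(n_nodes):
    X = rng.normal(loc=0.5 * i, size=(n_local, dim))  # non-IID: shifted features per node
    y = X @ w_true + 0.1 * rng.normal(size=n_local)
    data.append((X, y))

# Ring topology: each node mixes equally with itself and its two neighbors.
# W is doubly stochastic, so gossip steps contract toward the network average.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = W[i, (i - 1) % n_nodes] = W[i, (i + 1) % n_nodes] = 1 / 3

models = np.zeros((n_nodes, dim))
lr = 0.01
for step in range(200):
    # 1) Local gradient step on each node's private least-squares objective.
    for i, (X, y) in enumerate(data):
        grad = X.T @ (X @ models[i] - y) / n_local
        models[i] -= lr * grad
    # 2) Gossip aggregation: each node replaces its model with the weighted
    #    average of its neighbors' models given by the mixing matrix W.
    models = W @ models

consensus_err = np.linalg.norm(models - models.mean(axis=0), axis=1).max()
print(f"max distance to consensus: {consensus_err:.4f}")
print(f"error vs. true model:      {np.linalg.norm(models.mean(axis=0) - w_true):.4f}")

Because the mixing matrix W is doubly stochastic, repeated gossip rounds drive all nodes toward consensus on the network-average model; swapping the ring for a denser topology speeds consensus at the cost of more communication per round, which is one of the topology trade-offs the research above explores.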
Papers
Boosting the Performance of Decentralized Federated Learning via Catalyst Acceleration
Qinglun Li, Miao Zhang, Yingqi Liu, Quanjun Yin, Li Shen, Xiaochun Cao
OledFL: Unleashing the Potential of Decentralized Federated Learning via Opposite Lookahead Enhancement
Qinglun Li, Miao Zhang, Mengzhu Wang, Quanjun Yin, Li Shen