Decentralized Federated Learning
Decentralized Federated Learning (DFL) is a collaborative machine learning framework that trains models across distributed devices without a central server, enhancing privacy and robustness. Current research focuses on challenges such as data heterogeneity and Byzantine attacks, addressed through novel aggregation algorithms, optimized communication strategies (including model caching and adaptive sampling), and the exploration of various network topologies. This serverless approach improves scalability and resilience compared to centralized federated learning, with significant implications for applications requiring privacy-preserving distributed training, such as healthcare and IoT.
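To make the serverless aggregation idea concrete, here is a minimal sketch of one family of DFL aggregation schemes: gossip averaging over a fixed network topology (a ring, in this example). All names and parameters are illustrative assumptions, not taken from a specific paper; each row of `params` stands in for one device's model parameters.

```python
import numpy as np

def ring_mixing_matrix(n):
    """Doubly stochastic mixing matrix for a ring topology:
    each node averages equally with itself and its two neighbors."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1 / 3
        W[i, (i - 1) % n] = 1 / 3
        W[i, (i + 1) % n] = 1 / 3
    return W

def gossip_round(params, W):
    """One decentralized aggregation round: every node replaces its
    parameter vector with a weighted average of its neighbors' vectors.
    No node ever communicates with a central server."""
    return W @ params

rng = np.random.default_rng(0)
n_nodes, dim = 8, 4
params = rng.normal(size=(n_nodes, dim))  # each row: one node's model

W = ring_mixing_matrix(n_nodes)
for _ in range(50):
    params = gossip_round(params, W)

# After repeated rounds the rows converge toward the global average,
# i.e., the nodes reach consensus without central coordination.
```

In a full DFL system, local training steps would be interleaved with these mixing rounds, and the mixing matrix would reflect whatever topology (ring, mesh, random graph) the deployment uses; robust variants replace the plain average with Byzantine-resilient aggregation rules.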