Decentralized Federated Learning
Decentralized Federated Learning (DFL) is a collaborative machine learning framework that trains models across distributed devices without a central server, enhancing privacy and robustness. Current research focuses on addressing challenges like data heterogeneity and Byzantine attacks through novel aggregation algorithms, optimized communication strategies (including model caching and adaptive sampling), and the exploration of various network topologies. This serverless approach improves scalability and resilience compared to centralized federated learning, with significant implications for applications requiring privacy-preserving distributed training, such as healthcare and IoT.
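The serverless aggregation described above can be sketched as gossip-style neighbor averaging over a fixed communication topology. The snippet below is an illustrative toy, not an algorithm from any specific paper: node names, the ring topology, and the uniform mixing weights are all assumptions chosen for clarity.

```python
import numpy as np

def gossip_round(models, adjacency):
    """One round of decentralized averaging over a fixed topology.

    models: list of 1-D parameter vectors, one per node.
    adjacency: symmetric 0/1 matrix; adjacency[i][j] == 1 means i and j are peers.
    """
    n = len(models)
    updated = []
    for i in range(n):
        # Each node averages its own parameters with its neighbors' parameters.
        # Uniform weights are used here for simplicity; practical DFL schemes
        # typically use a doubly stochastic mixing matrix.
        peers = [models[j] for j in range(n) if adjacency[i][j]]
        updated.append(np.mean([models[i]] + peers, axis=0))
    return updated

# Ring topology over 4 nodes: each node communicates with its two neighbors.
ring = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])
models = [np.full(3, float(i)) for i in range(4)]  # dummy "models" 0, 1, 2, 3
for _ in range(20):
    models = gossip_round(models, ring)
# Because this mixing is symmetric and doubly stochastic, repeated rounds
# drive every node toward the global mean (here 1.5) without any server.
```

In a real DFL system, each node would also run local SGD steps between gossip rounds, and the mixing weights or topology would be adapted to handle data heterogeneity and Byzantine peers.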