Decentralized FL
Decentralized federated learning (DFL) aims to collaboratively train machine learning models across multiple devices or agents without relying on a central server, enhancing data privacy and scalability. Current research focuses on addressing challenges like robustness to malicious actors (Byzantine attacks), handling data and model heterogeneity across participating entities, and improving communication efficiency through techniques such as one-bit compressive sensing and optimized consensus algorithms. DFL's significance lies in its potential to enable large-scale collaborative learning in privacy-sensitive applications and resource-constrained environments, impacting diverse fields from healthcare and finance to robotics and IoT.
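To make the server-free training loop concrete, below is a minimal NumPy sketch of decentralized gradient descent with a gossip-style consensus step over a ring of peers. It is an illustrative toy, not the method of any specific paper listed on this page: the least-squares objective, the ring mixing matrix `W`, and all variable names are assumptions chosen only to show how local updates and neighbour-to-neighbour averaging replace a central aggregator.

```python
# Minimal sketch of decentralized federated learning:
# local SGD on private data + gossip averaging over a ring topology (no server).
# Toy least-squares problem; all names and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

num_agents = 5   # devices collaborating without a central server
dim = 10         # model parameter dimension

# Each agent holds its own private linear-regression data.
A = [rng.normal(size=(20, dim)) for _ in range(num_agents)]
b = [A_i @ rng.normal(size=dim) + 0.1 * rng.normal(size=20) for A_i in A]

# Ring topology: each agent mixes its model with its two neighbours and itself.
# W is doubly stochastic, which is what drives the models toward consensus.
W = np.zeros((num_agents, num_agents))
for i in range(num_agents):
    W[i, i] = 1 / 3
    W[i, (i - 1) % num_agents] = 1 / 3
    W[i, (i + 1) % num_agents] = 1 / 3

x = np.zeros((num_agents, dim))  # row i = agent i's local model
lr = 0.01

for step in range(200):
    # 1) Local step: each agent computes a gradient on its own data only.
    grads = np.stack([
        A[i].T @ (A[i] @ x[i] - b[i]) / len(b[i]) for i in range(num_agents)
    ])
    x = x - lr * grads
    # 2) Consensus step: exchange models only with graph neighbours.
    x = W @ x

# After enough rounds the local models approximately agree.
print("max disagreement across agents:", np.abs(x - x.mean(axis=0)).max())
```

In practice the consensus step is where the research directions mentioned above plug in: robust aggregation rules replace the plain weighted average to tolerate Byzantine peers, and compression schemes (e.g. one-bit quantization of the exchanged updates) reduce the communication cost of each gossip round.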