Open-Source Federated Learning Frameworks
Open-source federated learning (FL) frameworks aim to facilitate collaborative model training across decentralized datasets without compromising data privacy. Current research emphasizes improving the efficiency and scalability of these frameworks, evaluating the performance and stability of various FL algorithms (such as FedAvg, FedProx, and more advanced variants), and addressing challenges such as open-vocabulary learning and privacy preservation, particularly in vertical FL settings. This work is crucial for advancing both the theoretical understanding and practical application of FL, enabling wider adoption in diverse fields while mitigating potential vulnerabilities.
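To make the core idea concrete, the following is a minimal NumPy sketch of the FedAvg loop referenced above: each client runs a few steps of local training on data that never leaves its device, and a server averages the resulting weights, weighted by local dataset size. The function names (local_sgd, fedavg_round), the linear least-squares model, and all hyperparameters are illustrative assumptions, not part of any particular framework's API.

```python
import numpy as np

def local_sgd(weights, X, y, lr=0.1, epochs=5):
    """Run a few epochs of plain gradient descent on one client's
    local data (least-squares linear model); return updated weights."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    """One FedAvg communication round: every client trains locally from
    the current global weights, and the server averages the results,
    weighted by each client's number of samples."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_ws = [local_sgd(global_w, X, y) for X, y in clients]
    return np.average(local_ws, axis=0, weights=sizes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three clients with differently sized local datasets (never shared).
    clients = []
    for n in (30, 50, 20):
        X = rng.normal(size=(n, 2))
        y = X @ true_w + 0.1 * rng.normal(size=n)
        clients.append((X, y))

    w = np.zeros(2)
    for _ in range(20):          # 20 communication rounds
        w = fedavg_round(w, clients)
    print("estimated weights:", w)   # approaches [2, -1]
```

Variants such as FedProx follow the same round structure but add a proximal term to the local objective to limit client drift; production frameworks additionally handle client sampling, secure aggregation, and communication compression.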
Papers
April 9, 2024
April 1, 2024
March 26, 2024
August 9, 2023
July 5, 2023
September 25, 2022
July 21, 2022
July 19, 2022