Federated Distillation
Federated distillation (FD) is a distributed machine learning approach that applies knowledge distillation to train models collaboratively across multiple clients without directly sharing sensitive data: instead of exchanging full model parameters as in standard federated learning, clients exchange model outputs such as logits or soft labels, which cuts communication costs and allows clients to run heterogeneous model architectures. Current research focuses on improving FD's robustness to adversarial attacks, addressing data heterogeneity among clients, and further reducing communication overhead through techniques such as accumulating local updates and selective knowledge sharing. The approach holds significant promise for privacy-preserving collaborative learning, particularly in resource-constrained settings such as edge networks and medical image analysis, where it enables training capable models without exposing raw data.
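To make the mechanics concrete, below is a minimal NumPy sketch of one common FD pattern, not taken from any particular paper: linear softmax classifiers on synthetic data stand in for client models, a shared unlabeled public set serves as the distillation medium, and plain averaging of client soft labels at the server stands in for the knowledge-aggregation step. All names here (X_pub, consensus, ce_step) are illustrative assumptions, not an established API.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, NUM_CLASSES, DIM = 3, 4, 8
ROUNDS, LR, TEMP = 30, 0.5, 2.0  # TEMP softens the distillation targets

def softmax(z, temp=1.0):
    z = z / temp
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Ground-truth linear map used only to synthesize toy data.
true_W = rng.normal(size=(DIM, NUM_CLASSES))
def make_data(n):
    X = rng.normal(size=(n, DIM))
    return X, (X @ true_W).argmax(axis=1)

# Shared unlabeled public set: the medium over which knowledge is exchanged.
X_pub, _ = make_data(256)
X_test, y_test = make_data(512)

# Each client holds a small private labeled set and its own model weights.
clients = []
for _ in range(NUM_CLIENTS):
    X, y = make_data(64)
    clients.append({"X": X, "y": y, "W": np.zeros((DIM, NUM_CLASSES))})

def ce_step(c, X, targets, lr, temp=1.0):
    # One gradient step on cross-entropy between softmax(X @ W / temp)
    # and the given (hard or soft) target distribution.
    p = softmax(X @ c["W"], temp)
    c["W"] -= lr * X.T @ (p - targets) / (len(X) * temp)

for rnd in range(ROUNDS):
    # 1) Local supervised training on each client's private labels.
    for c in clients:
        ce_step(c, c["X"], np.eye(NUM_CLASSES)[c["y"]], LR)
    # 2) Clients upload only soft predictions on the public set (never raw
    #    data or weights); the server averages them into a consensus teacher.
    consensus = np.mean(
        [softmax(X_pub @ c["W"], TEMP) for c in clients], axis=0)
    # 3) Each client distills the consensus back into its own model.
    for c in clients:
        ce_step(c, X_pub, consensus, LR, TEMP)

for i, c in enumerate(clients):
    acc = ((X_test @ c["W"]).argmax(axis=1) == y_test).mean()
    print(f"client {i}: test accuracy {acc:.2f}")
```

Note that the per-round payload is a matrix of soft labels over the public set, whose size is independent of the model's parameter count; this is the source of FD's communication savings relative to weight-exchanging federated averaging.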