Federated Distillation
Federated distillation (FD) is a distributed machine learning approach that uses knowledge distillation to train models collaboratively across multiple clients without directly sharing sensitive data: instead of exchanging model weights, clients exchange model outputs (e.g., soft predictions), so each client can learn from the others' knowledge. Current research focuses on improving FD's robustness to adversarial attacks, handling data heterogeneity among clients, and reducing communication cost through techniques such as accumulating local updates and selective knowledge sharing. This approach holds significant promise for privacy-preserving collaborative learning, particularly in resource-constrained settings such as edge networks and in sensitive domains such as medical image analysis, since it enables capable models to be trained without compromising data security.
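To make the exchange concrete, below is a minimal sketch of one common FD pattern: clients train locally on private data, upload only their logits on a shared public proxy dataset, and then distill from the server-averaged logits. All names, sizes, and the synthetic data are illustrative assumptions, not a specific published algorithm's reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLIENTS = 3
NUM_CLASSES = 10
TEMPERATURE = 2.0

def make_model():
    # Stand-in client model; real FD clients may use heterogeneous architectures.
    return nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES))

# Shared public (unlabeled) dataset, used only as a distillation reference.
# Only predictions on this set are communicated -- never private data or weights.
public_x = torch.randn(64, 32)

clients = [make_model() for _ in range(NUM_CLIENTS)]

def local_update(model, x, y, epochs=1):
    # Ordinary supervised training on the client's private data.
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt.step()

for round_idx in range(5):
    # 1. Local training on private data (synthetic here for illustration).
    for model in clients:
        private_x = torch.randn(128, 32)
        private_y = torch.randint(0, NUM_CLASSES, (128,))
        local_update(model, private_x, private_y)

    # 2. Each client uploads its soft predictions on the public set.
    with torch.no_grad():
        client_logits = torch.stack([m(public_x) for m in clients])
        # 3. Server aggregates: a simple average of client logits serves
        #    as the ensemble "teacher" signal.
        teacher_probs = F.softmax(client_logits.mean(dim=0) / TEMPERATURE, dim=1)

    # 4. Each client distills from the aggregated teacher predictions.
    for model in clients:
        opt = torch.optim.SGD(model.parameters(), lr=0.1)
        opt.zero_grad()
        student_log_probs = F.log_softmax(model(public_x) / TEMPERATURE, dim=1)
        kd_loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
        kd_loss.backward()
        opt.step()
```

Note the communication payload per round is just the logit matrix on the public set (here 64 x 10 floats per client), which is typically far smaller than a full model update; this is the source of FD's communication efficiency.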