One-Shot Federated Learning

One-shot federated learning (OSFL) aims to train a shared machine learning model across multiple clients with only a single round of communication, reducing both communication overhead and privacy risk relative to traditional multi-round federated learning. Much current research uses diffusion models or generative adversarial networks to synthesize data that reflects each client's local distribution, so the server can aggregate or distill client models effectively despite data heterogeneity and limited per-client data. This makes collaborative training practical in resource-constrained or privacy-sensitive settings, with applications ranging from medical imaging to the Internet of Things.
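The sketch below illustrates the general one-shot protocol described above, not any specific paper's method: each client trains locally once, uploads its model in a single communication round, and the server distills the resulting ensemble into a global model. Toy linear models and random tensors (standing in for the output of a server-side generative model such as a diffusion model or GAN) are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def local_train(model, data, labels, epochs=5, lr=0.1):
    """Single round of local training on one client's private data."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(data), labels)
        loss.backward()
        opt.step()
    return model


def distill_ensemble(client_models, synthetic_inputs, num_classes, epochs=20, lr=0.1):
    """Server-side aggregation: distill the client ensemble into one global
    model using synthetic inputs (stand-ins for generated data)."""
    global_model = nn.Linear(synthetic_inputs.shape[1], num_classes)
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    with torch.no_grad():
        # Average the clients' softened predictions as the distillation target.
        teacher = torch.stack(
            [F.softmax(m(synthetic_inputs), dim=1) for m in client_models]
        ).mean(dim=0)
    for _ in range(epochs):
        opt.zero_grad()
        student = F.log_softmax(global_model(synthetic_inputs), dim=1)
        loss = F.kl_div(student, teacher, reduction="batchmean")
        loss.backward()
        opt.step()
    return global_model


if __name__ == "__main__":
    torch.manual_seed(0)
    num_features, num_classes, num_clients = 10, 3, 4

    # Each client trains once on its own (heterogeneous) local data ...
    client_models = []
    for _ in range(num_clients):
        x = torch.randn(64, num_features)
        y = torch.randint(0, num_classes, (64,))
        client_models.append(local_train(nn.Linear(num_features, num_classes), x, y))

    # ... and uploads its model in a single communication round.
    # The server then trains the global model on synthetic data only.
    synthetic = torch.randn(256, num_features)
    global_model = distill_ensemble(client_models, synthetic, num_classes)
    print(global_model(torch.randn(1, num_features)))
```

Ensemble distillation on synthetic data is only one common aggregation strategy; other OSFL work instead averages client models directly or trains the global model on data generated per class from client-provided conditioning information.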

Papers