Federated Generative Models

Federated generative models combine federated learning's privacy-preserving distributed training with the data-generation capabilities of generative models. Current research emphasizes communication- and energy-efficient algorithms, such as one-shot (single-round) federated learning and on-demand quantization, particularly for resource-constrained devices like those in the Internet of Things. This line of work addresses the challenge of training generative models on decentralized, heterogeneous data without sharing the raw data itself, with applications ranging from robotics and drug discovery to personalized travel time estimation. The resulting models are reported to offer improved efficiency and robustness compared to traditional centralized methods.
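To make the communication-efficient setup concrete, the sketch below shows a toy federated averaging loop in which each client fits a simple Gaussian generative model on its private data and transmits only an 8-bit quantized parameter update to the server. This is a minimal illustration of the general pattern (local training, quantized upload, server-side aggregation), not the method of any particular paper listed here; names such as Client, fedavg_round, and the quantization helpers are illustrative assumptions.

import numpy as np

def quantize(delta, num_bits=8):
    # Uniformly quantize a parameter update to signed integers plus a scale factor.
    scale = np.max(np.abs(delta)) + 1e-12
    levels = 2 ** (num_bits - 1) - 1
    q = np.round(delta / scale * levels).astype(np.int8)
    return q, scale

def dequantize(q, scale, num_bits=8):
    levels = 2 ** (num_bits - 1) - 1
    return q.astype(np.float64) / levels * scale

class Client:
    # Holds private data; fits a simple Gaussian generative model locally.
    def __init__(self, data):
        self.data = data  # local samples never leave the client

    def local_update(self, global_params, lr=0.1, steps=10):
        mu, log_sigma = global_params
        for _ in range(steps):
            sigma2 = np.exp(2 * log_sigma)
            # Gradients of the Gaussian negative log-likelihood on local data
            grad_mu = -(self.data - mu).mean() / sigma2
            grad_ls = 1.0 - ((self.data - mu) ** 2).mean() / sigma2
            mu -= lr * grad_mu
            log_sigma -= lr * grad_ls
        # Upload only the quantized difference from the global model
        delta = np.array([mu, log_sigma]) - np.asarray(global_params)
        return quantize(delta)

def fedavg_round(global_params, clients):
    # Server dequantizes and averages the client deltas (plain FedAvg).
    deltas = [dequantize(q, s) for q, s in
              (c.local_update(np.array(global_params, dtype=float)) for c in clients)]
    return np.asarray(global_params) + np.mean(deltas, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Heterogeneous clients: each holds data drawn from a shifted distribution
    clients = [Client(rng.normal(loc=2.0 + 0.5 * i, scale=1.5, size=200))
               for i in range(5)]
    params = np.array([0.0, 0.0])  # [mu, log_sigma] of the global generative model
    for _ in range(30):
        params = fedavg_round(params, clients)
    print("learned mu=%.2f sigma=%.2f" % (params[0], np.exp(params[1])))
    # The trained global model can then generate synthetic samples:
    samples = rng.normal(params[0], np.exp(params[1]), size=5)

In practice the local model would be a GAN, VAE, or diffusion model rather than a single Gaussian, and the quantization scheme would be applied per layer, but the communication pattern stays the same: only compressed model updates, never raw data, cross the network.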

Papers