Centralized Federated Learning

Centralized federated learning (CFL) trains machine learning models collaboratively across multiple devices, with a central server coordinating the aggregation of model updates so that raw data never leaves each device, preserving privacy while improving model accuracy. Current research emphasizes efficiency and scalability, exploring various aggregation methods and addressing the challenges posed by heterogeneous (non-IID) data distributions, often employing deep learning models such as Long Short-Term Memory networks. This approach is significant for its potential to improve distributed machine learning applications while upholding data privacy, with impact in fields such as IoT and cybersecurity. Research is also actively exploring methods for fairly rewarding data contributors in CFL settings.
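The central aggregation step can be illustrated with a FedAvg-style sketch: each client sends its locally trained weights plus its local sample count, and the server computes a sample-weighted average. This is a minimal illustration, not the method of any particular paper; the function and variable names are hypothetical.

```python
from typing import List, Tuple

def federated_average(updates: List[Tuple[List[float], int]]) -> List[float]:
    """Server-side aggregation: average client weight vectors,
    weighted by each client's local sample count (FedAvg-style)."""
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    avg = [0.0] * dim
    for weights, n in updates:
        for i, w in enumerate(weights):
            avg[i] += w * n / total
    return avg

# Two hypothetical clients with unequal data sizes:
# the client with more samples contributes more to the global model.
clients = [([1.0, 2.0], 30), ([3.0, 4.0], 10)]
print(federated_average(clients))  # [1.5, 2.5]
```

Real CFL systems apply this same idea per layer to full model parameter tensors, and may replace the plain weighted mean with robust or privacy-preserving aggregation.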

Papers