Federated Stochastic Methods

Federated stochastic methods train machine learning models collaboratively across decentralized devices without directly sharing sensitive data. Current research focuses on improving the efficiency and robustness of algorithms such as Federated Averaging (FedAvg) and Federated Stochastic Gradient Descent (FedSGD), addressing the challenges posed by data heterogeneity and Byzantine attacks through techniques such as gradient scaling, normalization, and dynamic resource allocation. These advances matter for privacy-preserving machine learning in applications such as medical image analysis and IoT data processing, where they enable collaborative training while preserving data security and integrity.
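To make the FedAvg workflow concrete, below is a minimal NumPy sketch of the standard loop: the server broadcasts a global model, each client runs a few steps of local SGD on its own (here deliberately non-IID) data, and the server aggregates the returned models. All names (`local_sgd`, `fedavg_round`, `median_round`, the toy linear-regression setup, and the constants) are illustrative assumptions for this sketch, not part of any specific paper or library. The `median_round` variant shows one simple Byzantine-tolerant aggregation rule of the kind the robustness literature studies.

```python
import numpy as np

# Hypothetical toy setup: linear regression with one weight vector per client.
rng = np.random.default_rng(0)
DIM, CLIENTS, LOCAL_STEPS, LR = 5, 10, 20, 0.05

# Simulate non-IID clients: each client's features come from a shifted distribution.
true_w = rng.normal(size=DIM)
client_data = []
for c in range(CLIENTS):
    X = rng.normal(loc=c * 0.1, size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    client_data.append((X, y))

def local_sgd(w, X, y, steps=LOCAL_STEPS, lr=LR, batch=8):
    """FedAvg's inner loop: a few steps of minibatch SGD on one client's data."""
    w = w.copy()
    for _ in range(steps):
        idx = rng.choice(len(X), size=batch, replace=False)
        grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / batch  # squared-loss gradient
        w -= lr * grad
    return w

def fedavg_round(w_global, data):
    """One FedAvg round: broadcast, local training, then average client models.
    (Real FedAvg weights the average by client dataset size; sizes are equal here.)"""
    updates = [local_sgd(w_global, X, y) for X, y in data]
    return np.mean(updates, axis=0)

def median_round(w_global, data):
    """Byzantine-tolerant variant: coordinate-wise median instead of the mean,
    so a minority of adversarial client updates cannot drag the aggregate arbitrarily."""
    updates = [local_sgd(w_global, X, y) for X, y in data]
    return np.median(updates, axis=0)

w = np.zeros(DIM)
for r in range(30):
    w = fedavg_round(w, client_data)
print("distance to true weights:", np.linalg.norm(w - true_w))
```

FedSGD corresponds to the special case of a single local gradient step per round with gradients, rather than models, sent to the server; robust schemes based on gradient scaling or normalization would clip or rescale each client update before aggregation.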

Papers