Semi-Decentralized Federated Learning
Semi-decentralized federated learning (SDFL) addresses the limitations of fully centralized and fully decentralized federated learning by combining aspects of both approaches. Current research focuses on improving SDFL's scalability and robustness through techniques such as hierarchical frameworks, blockchain integration for trust management, and asynchronous training algorithms that handle device heterogeneity and intermittent connectivity. By balancing computational load and communication overhead, this hybrid approach aims to enhance the efficiency and privacy of distributed machine learning, particularly in resource-constrained environments such as the Internet of Things and edge computing. The resulting improvements in model accuracy and training speed have significant implications for applications including recommendation systems and industrial IoT.
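A minimal sketch of the hierarchical pattern described above, assuming a toy linear-regression task with synthetic data: clients within each cluster report to a cluster head that performs FedAvg-style aggregation (the centralized aspect), and cluster heads then average with a randomly chosen peer instead of a single global server (the decentralized aspect). All names and parameters (`local_update`, `DIM`, the gossip rule) are illustrative assumptions, not taken from any specific SDFL framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, data, lr=0.1, epochs=1):
    """One round of local training: gradient steps on a linear model with squared loss."""
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Synthetic clients partitioned into clusters (hypothetical sizes).
DIM, CLUSTERS, CLIENTS_PER_CLUSTER = 5, 3, 4
true_w = rng.normal(size=DIM)

def make_client():
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    return X, y

clusters = [[make_client() for _ in range(CLIENTS_PER_CLUSTER)]
            for _ in range(CLUSTERS)]
cluster_models = [np.zeros(DIM) for _ in range(CLUSTERS)]

for rnd in range(20):
    # Intra-cluster (centralized) step: each cluster head averages its clients' models.
    for c in range(CLUSTERS):
        local_models = [local_update(cluster_models[c], data) for data in clusters[c]]
        cluster_models[c] = np.mean(local_models, axis=0)

    # Inter-cluster (decentralized) step: each cluster head gossip-averages
    # with one randomly chosen peer rather than a global server.
    new_models = []
    for c in range(CLUSTERS):
        peer = rng.choice([i for i in range(CLUSTERS) if i != c])
        new_models.append(0.5 * (cluster_models[c] + cluster_models[peer]))
    cluster_models = new_models

print("error vs. true weights:", np.linalg.norm(cluster_models[0] - true_w))
```

The two-level structure is what distinguishes the setup from its endpoints: replacing the gossip step with a single global average recovers hierarchical centralized FL, while removing the cluster heads and having every client gossip directly recovers fully decentralized FL.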