Decentralized Framework

Decentralized frameworks distribute computation and data across multiple nodes, avoiding the single point of failure and the scalability limits of centralized systems. Current research focuses on improving communication efficiency: exploring network topologies such as ring and mesh, and developing algorithms such as federated learning with gradient tracking and model-aggregation schemes that cope with data heterogeneity and incomplete classes. The approach is particularly relevant where data privacy matters, for example federated learning in IoT and AI inference, and offers clear advantages in scalability and robustness over centralized alternatives.
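As a rough illustration of the communication pattern these frameworks build on, the sketch below simulates gossip-style parameter averaging over a ring topology. It is a minimal toy example, not the method of any particular paper: the mixing weights, node count, and pure-consensus setup (zero local gradients) are illustrative assumptions.

```python
# Minimal sketch: decentralized averaging over a ring topology, the basic
# building block behind decentralized SGD and gossip-style aggregation.
# All parameter values and sizes here are toy assumptions for illustration.
import numpy as np

def ring_mixing_matrix(n: int) -> np.ndarray:
    """Doubly stochastic mixing matrix: each node averages itself with its
    two ring neighbors (weight 1/3 each)."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1 / 3
        W[i, (i - 1) % n] = 1 / 3
        W[i, (i + 1) % n] = 1 / 3
    return W

def decentralized_step(params, grads, W, lr=0.1):
    """One round of decentralized SGD: mix parameters with neighbors
    (communication step), then apply each node's local gradient."""
    mixed = W @ params          # neighbor averaging along the ring
    return mixed - lr * grads   # local update

# Toy run: 5 nodes, each holding one scalar parameter, converging to consensus.
rng = np.random.default_rng(0)
params = rng.normal(size=(5, 1))
W = ring_mixing_matrix(5)
for _ in range(50):
    grads = np.zeros_like(params)  # pure consensus demo, no local loss
    params = decentralized_step(params, grads, W)
print(params.ravel())  # all nodes drift toward the common average
```

In a real decentralized training run, each node would compute `grads` from its own local data; the ring keeps per-round communication to two neighbors per node, which is where the communication-efficiency gains discussed above come from.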

Papers