Decentralized Framework
Decentralized frameworks distribute computation and data across multiple nodes, avoiding the single point of failure and the scalability bottleneck of centralized systems. Current research focuses on improving communication efficiency in these frameworks: exploring network topologies such as rings and meshes, and algorithms such as federated learning with gradient tracking, alongside model-aggregation techniques that handle data heterogeneity and incomplete classes (clients whose local data cover only a subset of the label space). The approach is particularly relevant where data privacy matters, such as federated learning for IoT and on-device AI inference, and it offers clear scalability and robustness advantages over centralized alternatives.
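The core communication pattern behind these frameworks can be sketched as decentralized SGD over a ring topology: each node holds its own model copy, takes a local gradient step, and then averages with its neighbors through a doubly stochastic mixing matrix. This is a minimal illustrative sketch, not the method of any particular paper listed below; the node count, step size, and toy per-node objectives are assumptions made for the example.

```python
import numpy as np

def ring_mixing_matrix(n):
    """Doubly stochastic mixing matrix for a ring: each node averages
    itself with its two neighbors, weight 1/3 each (an assumed choice)."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    return W

def decentralized_sgd(grads, x0, W, lr=0.1, steps=200):
    """Each node keeps a local model row in X; every round it takes a
    local gradient step, then gossip-averages with its neighbors via W."""
    n = len(grads)
    X = np.tile(x0, (n, 1)).astype(float)        # one model copy per node
    for _ in range(steps):
        G = np.stack([g(X[i]) for i, g in enumerate(grads)])
        X = W @ (X - lr * G)                     # local step, then mixing
    return X

# Toy heterogeneous data: node i minimizes ||x - t_i||^2 for its own t_i,
# mimicking clients whose local objectives disagree.
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
           np.array([-1.0, 0.0]), np.array([0.0, -1.0])]
grads = [lambda x, t=t: 2.0 * (x - t) for t in targets]

W = ring_mixing_matrix(4)
X = decentralized_sgd(grads, np.zeros(2), W, lr=0.1, steps=200)
# All rows of X end up in approximate consensus near the mean of the
# targets (the origin here), with a small residual bias per node that
# grows with the step size and data heterogeneity.
```

With a constant step size, plain decentralized SGD reaches only a neighborhood of the global optimum when local objectives differ; gradient-tracking variants remove that heterogeneity bias by additionally gossiping a running estimate of the global gradient.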
Papers
October 24, 2024
August 6, 2024
May 30, 2024
April 14, 2024
November 30, 2023
November 27, 2023
October 4, 2023
July 19, 2023
October 4, 2022
August 10, 2022
July 28, 2022
February 24, 2022