Decentralized Machine Learning

Decentralized machine learning trains models across distributed devices without centralizing data, prioritizing privacy and communication efficiency. Current research emphasizes robust algorithms for decentralized optimization, such as variants of stochastic gradient descent and Bayesian inference, together with efficient communication strategies that handle heterogeneous networks and non-identically distributed (non-IID) data. This approach enables large-scale machine learning on resource-constrained devices and in privacy-sensitive applications, with impact on fields such as federated learning and agent-based modeling.
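As a minimal sketch of the idea, the following toy example (not from any specific paper; the ring topology, learning rate, and linear-regression task are illustrative assumptions) runs decentralized SGD with gossip averaging: each node takes a gradient step on its own non-IID data shard, then exchanges only model parameters with its neighbors, so raw data never leaves a device.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 nodes, each holding a private, non-IID shard
# of the same linear-regression problem (different feature scales).
n_nodes, dim = 4, 5
w_true = rng.normal(size=dim)
data = []
for i in range(n_nodes):
    X = rng.normal(scale=1.0 + 0.5 * i, size=(50, dim))  # non-IID across nodes
    y = X @ w_true + 0.01 * rng.normal(size=50)
    data.append((X, y))

# Ring-topology mixing matrix: each node averages its parameters
# with its two neighbors (doubly stochastic, so consensus is preserved).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

models = np.zeros((n_nodes, dim))  # one parameter vector per node
lr = 0.05
for step in range(500):
    # Local SGD step on each node's private shard (no raw data is shared).
    grads = np.stack([
        2 * X.T @ (X @ w - y) / len(y) for (X, y), w in zip(data, models)
    ])
    models -= lr * grads
    # Gossip step: only parameters travel over the network.
    models = W @ models
```

The mixing matrix `W` is the communication pattern: replacing the ring with a denser graph speeds up consensus at the cost of more messages per round, which is exactly the communication/accuracy trade-off the paragraph above refers to.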

Papers