Federated Learning
Federated learning (FL) is a decentralized machine learning approach that enables multiple devices to collaboratively train a model without sharing their raw data, thereby preserving privacy. Current research focuses on challenges such as data heterogeneity (non-IID data), communication efficiency (e.g., using scalar updates or spiking neural networks), and robustness to adversarial attacks and concept drift, often employing techniques such as knowledge distillation, James-Stein estimators, and adaptive client selection. FL's significance lies in its potential to unlock the power of massive, distributed datasets for training sophisticated models while adhering to privacy regulations and ethical considerations, with applications spanning healthcare, IoT, and other sensitive domains.
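As a minimal sketch of the core idea, the following toy implementation of federated averaging (FedAvg, the canonical FL algorithm) shows the key property described above: each client trains a simple logistic-regression model on its own private data, and the server only ever receives model weights, never the data itself. The client/helper names (`local_update`, `fed_avg`) and hyperparameters are illustrative choices, not from any of the papers listed below.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=1):
    """One client's local step: logistic-regression SGD on its private data.

    Only the updated weight vector is returned to the server; the
    raw (X, y) data never leaves the client.
    """
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)       # logistic-loss gradient
        w -= lr * grad
    return w

def fed_avg(client_data, rounds=20, dim=2):
    """Server loop: broadcast global weights, collect local updates,
    and average them weighted by each client's dataset size."""
    global_w = np.zeros(dim)
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_data:
            updates.append(local_update(global_w, X, y))
            sizes.append(len(y))
        # Weighted average of client models (the "FedAvg" step)
        global_w = np.average(updates, axis=0, weights=np.array(sizes, float))
    return global_w
```

In a real deployment the "clients" would be separate devices communicating over a network, and the averaging step is often combined with the quantization, partial-training, or client-selection strategies studied in the papers below to reduce communication cost.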
Papers
AutoDFL: A Scalable and Automated Reputation-Aware Decentralized Federated Learning
Meryem Malak Dif, Mouhamed Amine Bouchiha, Mourad Rabah, Yacine Ghamri-Doudane
VerifBFL: Leveraging zk-SNARKs for A Verifiable Blockchained Federated Learning
Ahmed Ayoub Bellachia, Mouhamed Amine Bouchiha, Yacine Ghamri-Doudane, Mourad Rabah
Federated Learning with Workload Reduction through Partial Training of Client Models and Entropy-Based Data Selection
Hongrui Shi, Valentin Radu, Po Yang
Accelerating Energy-Efficient Federated Learning in Cell-Free Networks with Adaptive Quantization
Afsaneh Mahmoudi, Ming Xiao, Emil Björnson
Blockchain-Empowered Cyber-Secure Federated Learning for Trustworthy Edge Computing
Ervin Moore, Ahmed Imteaj, Md Zarif Hossain, Shabnam Rezapour, M. Hadi Amini
FedGAT: A Privacy-Preserving Federated Approximation Algorithm for Graph Attention Networks
Siddharth Ambekar, Yuhang Yao, Ryan Li, Carlee Joe-Wong
fluke: Federated Learning Utility frameworK for Experimentation and research
Mirko Polato
DualGFL: Federated Learning with a Dual-Level Coalition-Auction Game
Xiaobing Chen, Xiangwei Zhou, Songyang Zhang, Mingxuan Sun
LoLaFL: Low-Latency Federated Learning via Forward-only Propagation
Jierui Zhang, Jianhao Huang, Kaibin Huang
FedPIA -- Permuting and Integrating Adapters leveraging Wasserstein Barycenters for Finetuning Foundation Models in Multi-Modal Federated Learning
Pramit Saha, Divyanshu Mishra, Felix Wagner, Konstantinos Kamnitsas, J. Alison Noble