Federated Learning
Federated learning (FL) is a decentralized machine learning approach in which multiple devices collaboratively train a shared model without exchanging their raw data, thereby preserving privacy. Current research addresses challenges such as data heterogeneity (non-IID data), communication efficiency (e.g., using scalar updates or spiking neural networks), and robustness to adversarial attacks or concept drift, often employing techniques such as knowledge distillation, James-Stein estimators, and adaptive client selection. FL's significance lies in its potential to unlock massive, distributed datasets for training sophisticated models while adhering to privacy regulations and ethical considerations, with applications spanning healthcare, IoT, and other sensitive domains.
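The core training loop described above can be illustrated with a minimal federated averaging (FedAvg-style) sketch: clients run a few local gradient steps on private data, and a server aggregates only the resulting weights. The linear model, shard sizes, learning rate, and round count below are illustrative assumptions, not taken from any of the papers listed.

```python
import numpy as np

# Minimal FedAvg-style sketch: a linear model trained collaboratively.
# Only model weights leave each client; the (X, y) shards stay local.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Each client holds a private data shard of a different size.
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.01 * rng.normal(size=n)
    clients.append((X, y))

def local_update(w, X, y, lr=0.1, epochs=5):
    """Run a few full-batch gradient steps locally; return new weights."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # squared-error gradient
        w -= lr * grad
    return w

w_global = np.zeros(2)
for _ in range(20):  # communication rounds
    updates = [local_update(w_global, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    # Server step: average client weights, weighted by dataset size.
    w_global = np.average(updates, axis=0, weights=sizes)

print(w_global)  # should approach true_w = [2, -1]
```

Weighting the average by client dataset size is the standard FedAvg choice; the heterogeneity and communication-efficiency papers above can be read as modifications to this local-update/aggregate loop.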
Papers
FLAME: Adaptive and Reactive Concept Drift Mitigation for Federated Learning Deployments
Ioannis Mavromatis, Stefano De Feo, Aftab Khan
A Federated Learning Platform as a Service for Advancing Stroke Management in European Clinical Centers
Diogo Reis Santos, Albert Sund Aillet, Antonio Boiano, Usevalad Milasheuski, Lorenzo Giusti, Marco Di Gennaro, Sanaz Kianoush, Luca Barbieri, Monica Nicoli, Michele Carminati, Alessandro E. C. Redondi, Stefano Savazzi, Luigi Serio
Comments on "Privacy-Enhanced Federated Learning Against Poisoning Adversaries"
Thomas Schneider, Ajith Suresh, Hossein Yalame
Leveraging Pre-trained Models for Robust Federated Learning for Kidney Stone Type Recognition
Ivan Reyes-Amezcua, Michael Rojas-Ruiz, Gilberto Ochoa-Ruiz, Andres Mendez-Vazquez, Christian Daul
HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning
Momin Ahmad Khan, Yasra Chandio, Fatima Muhammad Anwar
Flight: A FaaS-Based Framework for Complex and Hierarchical Federated Learning
Nathaniel Hudson, Valerie Hayot-Sasson, Yadu Babuji, Matt Baughman, J. Gregory Pauloski, Ryan Chard, Ian Foster, Kyle Chard
Communication and Energy Efficient Federated Learning using Zero-Order Optimization Technique
Elissa Mhanna, Mohamad Assaad
Federated Large Language Models: Current Progress and Future Directions
Yuhang Yao, Jianyi Zhang, Junda Wu, Chengkai Huang, Yu Xia, Tong Yu, Ruiyi Zhang, Sungchul Kim, Ryan Rossi, Ang Li, Lina Yao, Julian McAuley, Yiran Chen, Carlee Joe-Wong
FLeNS: Federated Learning with Enhanced Nesterov-Newton Sketch
Sunny Gupta, Mohit, Pankhi Kashyap, Pranav Jeevan, Amit Sethi
Energy-Aware Federated Learning in Satellite Constellations
Nasrin Razmi, Bho Matthiesen, Armin Dekorsy, Petar Popovski
FedGCA: Global Consistent Augmentation Based Single-Source Federated Domain Generalization
Yuan Liu, Shu Wang, Zhe Qu, Xingyu Li, Shichao Kan, Jianxin Wang
Flotta: a Secure and Flexible Spark-inspired Federated Learning Framework
Claudio Bonesana, Daniele Malpetti, Sandra Mitrović, Francesca Mangili, Laura Azzimonti
Global Outlier Detection in a Federated Learning Setting with Isolation Forest
Daniele Malpetti, Laura Azzimonti
Noise-Robust and Resource-Efficient ADMM-based Federated Learning
Ehsan Lari, Reza Arablouei, Vinay Chakravarthi Gogineni, Stefan Werner