Client Dropout
Client dropout, the unpredictable unavailability of participating devices in federated learning (FL), significantly hinders the convergence and accuracy of collaboratively trained models. Current research focuses on mitigating this issue through algorithmic improvements, such as imputing missing updates or substituting them with updates from similar clients, and by developing aggregation schemes that are robust to variability in client participation. These efforts aim to improve the reliability and efficiency of FL, particularly in resource-constrained environments like mobile edge networks, and are crucial for expanding the practical applications of this privacy-preserving machine learning paradigm.
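The substitution idea above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not any specific paper's method): when a client drops out of a round, its missing update is replaced by the current update of the active client whose previous-round update was most similar (by cosine similarity), and the server then averages as usual. All function and variable names here are assumptions for illustration.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two update vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def aggregate_with_substitution(updates, last_updates, dropped):
    """Federated averaging where each dropped client's missing update is
    substituted by the current update of the most similar active client.

    updates      : dict client_id -> this-round update (only active clients)
    last_updates : dict client_id -> previous-round update (all clients)
    dropped      : set of client ids that dropped out this round
    """
    active = [c for c in updates if c not in dropped]
    filled = dict(updates)
    for c in dropped:
        # Proxy for similarity: closeness of the clients' last-round updates.
        best = max(active, key=lambda a: cosine_sim(last_updates[c], last_updates[a]))
        filled[c] = updates[best]
    # Uniform FedAvg over the filled-in set of updates.
    return np.mean(list(filled.values()), axis=0)

# Toy round: client 2 drops out; its last update resembled client 0's,
# so client 0's current update stands in for it.
last = {0: np.array([1.0, 0.0]),
        1: np.array([0.0, 1.0]),
        2: np.array([1.0, 0.1])}
updates = {0: np.array([2.0, 0.0]),
           1: np.array([0.0, 2.0])}
agg = aggregate_with_substitution(updates, last, dropped={2})
```

In this toy round the server averages [2, 0], [0, 2], and the substituted [2, 0], yielding [4/3, 2/3] rather than the biased [1, 1] it would get by silently ignoring client 2.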