Paper ID: 2205.10920
Test-Time Robust Personalization for Federated Learning
Liangze Jiang, Tao Lin
Federated Learning (FL) is a machine learning paradigm in which many clients collaboratively learn a shared global model from decentralized training data. Personalized FL additionally adapts the global model to individual clients and achieves promising results when local training and test distributions are consistent. For real-world personalized FL applications, however, it is crucial to go one step further: robustifying FL models against evolving local test sets during deployment, where various distribution shifts can arise. In this work, we identify the pitfalls of existing methods under test-time distribution shifts and propose Federated Test-time Head Ensemble plus tuning (FedTHE+), which personalizes FL models while remaining robust to various test-time distribution shifts. We demonstrate the advantage of FedTHE+ (and its computationally efficient variant FedTHE) over strong competitors by training various neural architectures (CNN, ResNet, and Transformer) on CIFAR10 and ImageNet and evaluating them under various test distributions. Along with this, we build a benchmark for assessing the performance and robustness of personalized FL methods during deployment. Code: https://github.com/LINs-lab/FedTHE.
Submitted: May 22, 2022
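The abstract describes FedTHE as a test-time ensemble of classification heads on top of a shared representation. The snippet below is a minimal sketch of that general idea only, under assumptions not stated in the abstract: a shared feature extractor, a global head and a personalized local head, and a scalar weight `alpha` that mixes their logits at inference; the actual method adapts this combination (and, for FedTHE+, fine-tunes) at test time, which is not shown here. All class and parameter names are hypothetical.

```python
# Illustrative sketch (not the authors' implementation): combining a global
# and a personalized classification head at test time.
import torch
import torch.nn as nn


class TwoHeadClassifier(nn.Module):
    def __init__(self, feature_extractor: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.feature_extractor = feature_extractor              # shared backbone
        self.global_head = nn.Linear(feat_dim, num_classes)     # head from the global model
        self.personal_head = nn.Linear(feat_dim, num_classes)   # client-specific head

    def forward(self, x: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
        feats = self.feature_extractor(x)
        logits_g = self.global_head(feats)
        logits_p = self.personal_head(feats)
        # Ensemble the two heads; in a test-time robust scheme, alpha would be
        # chosen or adapted per test batch/sample rather than fixed.
        return alpha * logits_g + (1.0 - alpha) * logits_p


if __name__ == "__main__":
    # Toy usage with a random backbone and CIFAR10-shaped inputs (hypothetical sizes).
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128), nn.ReLU())
    model = TwoHeadClassifier(backbone, feat_dim=128, num_classes=10)
    x = torch.randn(4, 3, 32, 32)
    print(model(x, alpha=0.5).argmax(dim=1))
```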