Paper ID: 2406.02447 • Published Jun 4, 2024
Federated Class-Incremental Learning with Hierarchical Generative Prototypes
Riccardo Salami, Pietro Buzzega, Matteo Mosconi, Mattia Verasani, Simone Calderara
Abstract
Federated Learning (FL) aims at unburdening the training of deep models by
distributing computation across multiple devices (clients) while safeguarding
data privacy. On top of that, Federated Continual Learning (FCL) also accounts
for data distribution evolving over time, mirroring the dynamic nature of
real-world environments. While previous studies have identified Catastrophic
Forgetting and Client Drift as primary causes of performance degradation in
FCL, we shed light on the importance of Incremental Bias and Federated Bias,
which cause models to prioritize classes that are recently introduced or
locally predominant, respectively. Our proposal confines both biases to the
last layer by efficiently fine-tuning a pre-trained backbone with learnable
prompts, so that clients produce less biased representations while the remaining
bias is concentrated in their classifiers. Therefore, instead of relying solely
on parameter aggregation, we leverage generative prototypes to rebalance the
predictions of the global model. Our method significantly improves on the
current state of the art, with an average accuracy gain of +7.8%. Code to
reproduce the results is provided in the supplementary material.
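The abstract describes the generative-prototype step only at a high level. The sketch below (PyTorch, not the authors' code) illustrates one plausible reading of it: each client summarizes its backbone features as per-class Gaussian prototypes, and the server samples from the collected prototypes to rebalance the classification head rather than averaging classifier weights. All function names, shapes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of classifier rebalancing with generative (Gaussian) prototypes.
# Hypothetical names and hyperparameters; not the paper's implementation.
import torch
import torch.nn as nn

def fit_prototypes(features: torch.Tensor, labels: torch.Tensor):
    """Summarize local backbone features as per-class Gaussian prototypes (mean, std)."""
    protos = {}
    for c in labels.unique().tolist():
        feats_c = features[labels == c]
        mean = feats_c.mean(dim=0)
        std = feats_c.std(dim=0) if feats_c.size(0) > 1 else torch.zeros_like(mean)
        protos[c] = (mean, std + 1e-6)
    return protos

def rebalance_classifier(head: nn.Linear, protos, steps: int = 100,
                         samples_per_class: int = 32, lr: float = 1e-2):
    """Fine-tune only the classification head on features sampled from the prototypes,
    so every class contributes equally regardless of being recent or locally rare."""
    opt = torch.optim.SGD(head.parameters(), lr=lr)
    ce = nn.CrossEntropyLoss()
    for _ in range(steps):
        xs, ys = [], []
        for c, (mean, std) in protos.items():
            # Sample synthetic features for class c from its Gaussian prototype.
            xs.append(mean + std * torch.randn(samples_per_class, mean.numel()))
            ys.append(torch.full((samples_per_class,), c, dtype=torch.long))
        x, y = torch.cat(xs), torch.cat(ys)
        opt.zero_grad()
        ce(head(x), y).backward()
        opt.step()

# Toy usage: 512-d backbone features, 10 classes seen so far.
feats, labels = torch.randn(200, 512), torch.randint(0, 10, (200,))
head = nn.Linear(512, 10)
rebalance_classifier(head, fit_prototypes(feats, labels))
```

Since only class means and standard deviations would be exchanged in such a scheme, it keeps raw client data private while letting the server correct a classifier biased toward recent or locally dominant classes.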