Federated Continual Learning
Federated Continual Learning (FCL) addresses the challenge of training machine learning models across decentralized devices while adapting to continuously evolving data streams, without compromising data privacy. Current research focuses on mitigating "catastrophic forgetting" – the loss of previously learned knowledge when a model is trained on new tasks – through techniques such as buffer-based gradient projection, generative replay using diffusion models, and personalized learning via multi-granularity prompts. These advances matter for building robust, privacy-preserving AI in dynamic real-world settings, such as personalized healthcare, robotics, and online security systems, where data is both distributed across devices and constantly changing.
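As a minimal sketch of two building blocks the summary mentions, the snippet below combines federated averaging (FedAvg-style parameter averaging across clients) with a per-device rehearsal buffer that replays old-task samples alongside new data to reduce catastrophic forgetting. The class and function names are illustrative, not from any specific paper or library.

```python
import random

def fedavg(client_weights):
    """Server step: average model parameters across clients (FedAvg-style).

    client_weights: list of per-client parameter vectors (lists of floats).
    """
    n = len(client_weights)
    return [sum(param) / n for param in zip(*client_weights)]

class ReplayBuffer:
    """Reservoir-sampled buffer of past-task samples kept on-device.

    Replaying a few of these samples during local training on a new task
    is a simple rehearsal strategy against catastrophic forgetting.
    (Illustrative sketch, not a specific paper's method.)
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, sample):
        # Reservoir sampling keeps a uniform subset of everything seen.
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(sample)
        else:
            i = random.randrange(self.seen)
            if i < self.capacity:
                self.data[i] = sample

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))
```

For example, a client would mix `buffer.sample(k)` into each local batch of new-task data, train locally, and send only its updated weights to the server, which calls `fedavg` – raw data never leaves the device.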