Federated Multi-Task Learning
Federated Multi-Task Learning (FMTL) combines federated learning's privacy-preserving distributed training with multi-task learning's ability to leverage shared information across related tasks. Current research emphasizes efficient resource management across multiple simultaneous FMTL jobs, addressing challenges such as heterogeneous (non-IID) data distributions, model heterogeneity across clients, and the need to balance fairness against communication overhead. This line of work matters for improving the efficiency and scalability of machine learning in resource-constrained settings, particularly edge computing and applications that require personalized models trained on decentralized data.
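One common FMTL pattern is to split each client's model into a shared component, which is communicated and averaged across clients (FedAvg-style), and a task-specific component that never leaves the client. The sketch below illustrates this with a deliberately tiny, hypothetical setup: each client fits a linear model `y ≈ w·x + b`, where the slope `w` is shared and federated-averaged while each client keeps its own bias `b` as a local task head. The function names and the two-client toy data are illustrative assumptions, not from any particular paper.

```python
# Minimal FMTL sketch (hypothetical setup): clients solve related linear
# regression tasks; the shared weight w is averaged across clients each
# round, while each client's task-specific bias b stays local.

def local_step(w_shared, b_local, data, lr=0.1):
    """One gradient step on a client's local data for y ~ w*x + b."""
    gw, gb = 0.0, 0.0
    for x, y in data:
        err = (w_shared * x + b_local) - y
        gw += err * x
        gb += err
    n = len(data)
    return w_shared - lr * gw / n, b_local - lr * gb / n

def fmtl_round(w_shared, biases, client_data):
    """One federated round: local updates, then average the shared weight."""
    new_ws, new_bs = [], []
    for b, data in zip(biases, client_data):
        w, b = local_step(w_shared, b, data)
        new_ws.append(w)
        new_bs.append(b)
    # Only the shared parameter is communicated and averaged;
    # the task-specific biases never leave the clients.
    return sum(new_ws) / len(new_ws), new_bs

# Two clients share the slope 2.0 but have different task offsets (+1 / -1).
clients = [[(x, 2.0 * x + 1.0) for x in (0.0, 1.0, 2.0)],
           [(x, 2.0 * x - 1.0) for x in (0.0, 1.0, 2.0)]]
w, biases = 0.0, [0.0, 0.0]
for _ in range(200):
    w, biases = fmtl_round(w, biases, clients)
# w approaches the shared slope 2.0; each bias approaches its client's offset.
```

The key design point is the communication pattern: the server only ever sees (averaged) shared parameters, so personalization lives entirely in the local task heads, which also keeps per-round communication small.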