Federated Multi-Task Learning

Federated Multi-Task Learning (FMTL) combines federated learning's privacy-preserving distributed training with multi-task learning's ability to leverage shared information across related tasks. Current research emphasizes efficient resource management across multiple simultaneous FMTL jobs, addressing challenges such as heterogeneous data distributions and model incongruity among clients, while improving fairness and reducing communication overhead. This approach is significant for improving the efficiency and scalability of machine learning in resource-constrained environments, particularly in edge computing and in applications requiring personalized models trained on decentralized data.
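To make the idea concrete, the following is a minimal, hypothetical sketch (not any specific paper's method): each client learns a shared linear layer that the server averages FedAvg-style, plus a private per-task bias that never leaves the client, so related tasks share structure while personalization stays local. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Hypothetical FMTL sketch: a shared weight vector averaged by the
# server, plus a private task-specific bias kept on each client.

rng = np.random.default_rng(0)

def make_client(task_bias, n=200, d=5):
    """Generate one client's local regression data for its task."""
    X = rng.normal(size=(n, d))
    w_true = np.arange(1, d + 1, dtype=float)  # structure shared by all tasks
    y = X @ w_true + task_bias                 # task-specific offset
    return X, y

clients = [make_client(b) for b in (-2.0, 0.0, 3.0)]  # three related tasks

d = 5
w_shared = np.zeros(d)            # server-held shared parameters
biases = [0.0 for _ in clients]   # private per-task parameters (never sent)

lr, local_steps, rounds = 0.05, 10, 30
for _ in range(rounds):
    updates = []
    for i, (X, y) in enumerate(clients):
        w = w_shared.copy()
        for _ in range(local_steps):
            err = X @ w + biases[i] - y
            w -= lr * (X.T @ err) / len(y)  # gradient step on shared part
            biases[i] -= lr * err.mean()    # gradient step on private part
        updates.append(w)
    w_shared = np.mean(updates, axis=0)     # server averages shared weights only

print(np.round(w_shared, 2))               # approaches the common signal
print([round(b, 2) for b in biases])       # each bias personalizes to its task
```

In this toy setup the averaged shared weights recover the structure common to all tasks, while the biases absorb each task's idiosyncrasy; real FMTL systems apply the same split to deep models (shared backbone, task-specific heads) under the heterogeneity and communication constraints described above.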

Papers