Multi-Task Learning
Multi-task learning (MTL) aims to improve model efficiency and generalization by training a single model to perform multiple related tasks simultaneously. Current research focuses on challenges such as task interference and optimization difficulties, exploring architectures like Mixture-of-Experts (MoE), low-rank adapters, and hierarchical models to improve performance and efficiency across diverse tasks. MTL's significance lies in its potential to improve resource utilization and to yield more robust and adaptable AI systems, with applications spanning natural language processing, computer vision, and scientific modeling.
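To make the shared-model idea concrete, the sketch below shows one common MTL setup: a shared encoder with two task-specific heads, trained with uncertainty-based loss weighting in the spirit of homoscedastic-uncertainty approaches. This is a minimal PyTorch illustration under assumed toy dimensions and data; the class and method names (MultiTaskModel, weighted_loss) are illustrative and do not correspond to the method of any paper listed below.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Hard parameter sharing: one shared encoder, one head per task."""
    def __init__(self, in_dim: int, hidden_dim: int, n_classes: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        self.cls_head = nn.Linear(hidden_dim, n_classes)  # classification task
        self.reg_head = nn.Linear(hidden_dim, 1)          # regression task
        # Learned log-variances used to weight the two task losses
        # (uncertainty-based weighting; values here are an assumption).
        self.log_vars = nn.Parameter(torch.zeros(2))

    def forward(self, x):
        z = self.encoder(x)
        return self.cls_head(z), self.reg_head(z).squeeze(-1)

    def weighted_loss(self, cls_logits, cls_target, reg_pred, reg_target):
        l_cls = nn.functional.cross_entropy(cls_logits, cls_target)
        l_reg = nn.functional.mse_loss(reg_pred, reg_target)
        losses = torch.stack([l_cls, l_reg])
        # exp(-log_var) scales each task loss; the +log_var term keeps the
        # weights from collapsing to zero.
        return (torch.exp(-self.log_vars) * losses + self.log_vars).sum()

# Toy usage: random data for two tasks that share the same inputs.
model = MultiTaskModel(in_dim=16, hidden_dim=32, n_classes=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 16)
y_cls = torch.randint(0, 4, (8,))
y_reg = torch.randn(8)

cls_logits, reg_pred = model(x)
loss = model.weighted_loss(cls_logits, y_cls, reg_pred, y_reg)
opt.zero_grad()
loss.backward()
opt.step()
```

The shared encoder is where positive transfer (and, potentially, task interference) arises; the per-task heads and the learned loss weights are the simplest levers current work extends with MoE layers, low-rank adapters, and more elaborate weighting schemes.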
Papers
Tissue Concepts: supervised foundation models in computational pathology
Till Nicke, Jan Raphael Schaefer, Henning Hoefener, Friedrich Feuerhake, Dorit Merhof, Fabian Kiessling, Johannes Lotz
SpinMultiNet: Neural Network Potential Incorporating Spin Degrees of Freedom with Multi-Task Learning
Koki Ueno, Satoru Ohuchi, Kazuhide Ichikawa, Kei Amii, Kensuke Wakasugi
NeuroLM: A Universal Multi-task Foundation Model for Bridging the Gap between Language and EEG Signals
Wei-Bang Jiang, Yansen Wang, Bao-Liang Lu, Dongsheng Li
Query-by-Example Keyword Spotting Using Spectral-Temporal Graph Attentive Pooling and Multi-Task Learning
Zhenyu Wang, Shuyu Kong, Li Wan, Biqiao Zhang, Yiteng Huang, Mumin Jin, Ming Sun, Xin Lei, Zhaojun Yang
Can Optimization Trajectories Explain Multi-Task Transfer?
David Mueller, Mark Dredze, Nicholas Andrews
Detailed delineation of the fetal brain in diffusion MRI via multi-task learning
Davood Karimi, Camilo Calixto, Haykel Snoussi, Maria Camila Cortes-Albornoz, Clemente Velasco-Annis, Caitlin Rollins, Camilo Jaimes, Ali Gholipour, Simon K. Warfield
Dynamic Adaptive Optimization for Effective Sentiment Analysis Fine-Tuning on Large Language Models
Hongcheng Ding, Xuanze Zhao, Shamsul Nahar Abdullah, Deshinta Arrova Dewi, Zixiao Jiang, Xiangyu Shi
Analytical Uncertainty-Based Loss Weighting in Multi-Task Learning
Lukas Kirchdorfer, Cathrin Elich, Simon Kutsche, Heiner Stuckenschmidt, Lukas Schott, Jan M. Köhler