Paper ID: 2205.00671
Jack and Masters of All Trades: One-Pass Learning Sets of Model Sets From Large Pre-Trained Models
Han Xiang Choong, Yew-Soon Ong, Abhishek Gupta, Caishun Chen, Ray Lim
For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These large pre-trained models, or Jacks of All Trades (JATs), when fine-tuned for downstream tasks, are gaining importance as drivers of deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements can limit the real-world utility of a single JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration of the concepts underlying the creation of a diverse set of compact machine learning model sets. Composed of many smaller, specialized models, the Set of Sets is formulated to simultaneously fulfil many task settings and environmental conditions. A means of arriving at such a set tractably, in a single pass of a neuroevolutionary multitasking algorithm, is presented for the first time, bringing us closer to models that are collectively Masters of All Trades.
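To make the one-pass idea concrete, below is a minimal sketch of an evolutionary multitasking loop in the spirit of the multifactorial evolutionary algorithm (MFEA) family; the abstract does not specify the paper's exact scheme, so everything here (the toy task_loss, the skill-factor bookkeeping, and the hyperparameters K, DIM, GROUP, GENS, RMP) is an illustrative assumption. A single evolutionary run maintains task-specialized subpopulations of compact parameter vectors and returns one specialist subset per task, i.e., a small "set of sets".

```python
# Minimal MFEA-style evolutionary multitasking sketch. All names, task
# definitions, and hyperparameters are illustrative assumptions; this is
# NOT the paper's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for K downstream tasks: each scores a compact
# parameter vector theta (think: a small specialized model derived from a JAT).
def task_loss(theta, k):
    target = np.full_like(theta, float(k))  # toy per-task optimum
    return float(np.sum((theta - target) ** 2))

K, DIM, GROUP, GENS, RMP = 3, 16, 20, 100, 0.3  # RMP: random mating probability
POP = K * GROUP

pop = rng.normal(size=(POP, DIM))   # candidate compact models
skill = np.arange(POP) % K          # each individual specializes in one task

for _ in range(GENS):
    children, child_skill = [], []
    for _ in range(POP // 2):
        a, b = rng.integers(0, POP, size=2)
        if skill[a] == skill[b] or rng.random() < RMP:
            # Cross-task mating: useful traits can transfer between tasks.
            alpha = rng.random()
            c1 = alpha * pop[a] + (1.0 - alpha) * pop[b]
            c2 = (1.0 - alpha) * pop[a] + alpha * pop[b]
            # Offspring imitate the skill (task assignment) of either parent.
            s1 = skill[a] if rng.random() < 0.5 else skill[b]
            s2 = skill[a] if rng.random() < 0.5 else skill[b]
        else:
            # Otherwise, mutate each parent within its own task.
            c1 = pop[a] + 0.1 * rng.normal(size=DIM)
            c2 = pop[b] + 0.1 * rng.normal(size=DIM)
            s1, s2 = skill[a], skill[b]
        children += [c1, c2]
        child_skill += [s1, s2]

    merged = np.vstack([pop, np.array(children)])
    merged_skill = np.concatenate([skill, np.array(child_skill)])

    # Elitist survivor selection within each task keeps K specialist subsets.
    keep = []
    for k in range(K):
        idx = np.where(merged_skill == k)[0]
        order = np.argsort([task_loss(merged[i], k) for i in idx])
        keep.extend(idx[order][:GROUP])
    pop, skill = merged[keep], merged_skill[keep]

# One pass yields a "set of sets": specialist models grouped by task.
for k in range(K):
    best = min(task_loss(m, k) for m in pop[skill == k])
    print(f"task {k}: {np.sum(skill == k)} specialists, best loss {best:.4f}")
```

The per-task elitism preserves K specialist groups across generations, while cross-task mating (triggered with probability RMP) is what allows building blocks discovered for one task to transfer to others within the single pass.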
Submitted: May 2, 2022