Paper ID: 2202.08604
Two-stage architectural fine-tuning with neural architecture search using early-stopping in image classification
Youngkee Kim, Won Joon Yun, Youn Kyu Lee, Soyi Jung, Joongheon Kim
In many deep neural network (DNN) applications, the difficulty of gathering high-quality data in industrial settings hinders the practical use of DNNs. Thus, the concept of transfer learning has emerged, which leverages the knowledge of DNNs pretrained on large-scale datasets. Building on this, this paper proposes two-stage architectural fine-tuning inspired by neural architecture search (NAS). One of the main ideas is mutation, which reduces the search cost by exploiting given architectural information. Moreover, early-stopping is adopted, which cuts NAS costs by terminating the search process in advance. Experimental results verify that the proposed method reduces computational costs by 32.4% and search costs by 22.3%.
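The two cost-saving ideas named in the abstract, mutation (reusing given architectural information) and early-stopping (terminating the search in advance), can be illustrated with a minimal sketch. This is not the authors' implementation; the `evaluate`, `mutate`, and `search` functions and all parameters below are hypothetical stand-ins assumed for illustration, with a toy scoring function in place of validation accuracy.

```python
import random

def evaluate(arch):
    # Hypothetical proxy score for an architecture
    # (a stand-in for validation accuracy in real NAS).
    return sum(arch) / len(arch)

def mutate(arch):
    # Mutation: perturb one randomly chosen architectural choice while
    # reusing the rest of the given architecture's structure.
    child = list(arch)
    child[random.randrange(len(child))] = random.random()
    return child

def search(base_arch, max_steps=100, patience=10):
    """Mutation-based search with early-stopping: stop once `patience`
    consecutive steps bring no improvement, saving the remaining steps."""
    best_arch, best_score = base_arch, evaluate(base_arch)
    stale = 0
    for _ in range(max_steps):
        cand = mutate(best_arch)
        score = evaluate(cand)
        if score > best_score:
            best_arch, best_score, stale = cand, score, 0
        else:
            stale += 1
            if stale >= patience:  # early-stopping: terminate in advance
                break
    return best_arch, best_score

random.seed(0)
arch, score = search([0.5] * 8)
print(round(score, 3))
```

The early-stopping criterion here is a simple patience counter; the paper's actual criterion may differ, but the cost saving comes from the same mechanism of ending the search loop before `max_steps` is exhausted.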
Submitted: Feb 17, 2022