Paper ID: 2301.05919

Efficient Evaluation Methods for Neural Architecture Search: A Survey

Xiaotian Song, Xiangning Xie, Zeqiong Lv, Gary G. Yen, Weiping Ding, Jiancheng Lv, Yanan Sun

Neural Architecture Search (NAS) has received increasing attention because of its exceptional merits in automating the design of Deep Neural Network (DNN) architectures. However, the performance evaluation process, as a key part of NAS, often requires training a large number of DNNs, which inevitably makes NAS computationally expensive. In recent years, many Efficient Evaluation Methods (EEMs) have been proposed to address this critical issue. In this paper, we comprehensively survey the EEMs published to date and provide a detailed analysis to motivate the further development of this research direction. Specifically, we divide the existing EEMs into four categories based on the number of DNNs trained to construct them. This categorization reflects, in principle, the degree of efficiency, which in turn helps readers quickly grasp the methodological features of each category. In surveying each category, we further discuss the design principles and analyze the strengths and weaknesses to clarify the landscape of existing EEMs, thus making the research trends of EEMs easy to understand. Furthermore, we discuss the current challenges and open issues to identify future research directions in this emerging topic. In summary, this survey provides a convenient overview of EEMs for interested users, who can easily select the proper EEM for the task at hand, while researchers in the NAS field can continue exploring the future directions suggested in the paper.

Submitted: Jan 14, 2023