Differentiable Architecture Search
Differentiable architecture search (DARTS) automates neural network design by relaxing the discrete space of candidate architectures into a continuous one, so that the architecture itself can be found by gradient-based optimization. Current research focuses on improving the stability, efficiency, and robustness of DARTS, addressing issues such as performance collapse and the selection of suitable operations, often through novel regularization techniques, adaptive learning rate scheduling, and the incorporation of sparse operations or generative models. This automated design process has significant implications for accelerating the development of high-performing neural networks across various applications, reducing reliance on manual design and expert knowledge.
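The core idea of the continuous relaxation can be sketched in a few lines. The snippet below is a minimal, framework-free illustration (not the authors' implementation): each edge computes a softmax-weighted mixture of candidate operations, so the architecture parameters (the alphas) become ordinary continuous variables that a gradient-based optimizer can update; after search, the mixture is discretized by keeping the highest-weighted operation. The candidate operations and function names here are hypothetical placeholders.

```python
import math

# Hypothetical candidate operations on a scalar input; in real DARTS these
# would be neural operations such as convolutions, pooling, skip, or zero.
CANDIDATE_OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def softmax(alphas):
    """Numerically stable softmax over a list of architecture weights."""
    m = max(alphas)
    exps = [math.exp(a - m) for a in alphas]
    s = sum(exps)
    return [e / s for e in exps]

def mixed_op(x, alphas):
    """Continuous relaxation: output is a softmax-weighted sum of ALL
    candidate operations, making the choice differentiable in alphas."""
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, CANDIDATE_OPS.values()))

def discretize(alphas):
    """After search, keep only the operation with the largest weight."""
    names = list(CANDIDATE_OPS)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]

# With equal alphas the mixture is uniform: (1 + 2 + 0) / 3 = 1.0.
print(mixed_op(1.0, [0.0, 0.0, 0.0]))
# A dominant alpha selects its operation at discretization time.
print(discretize([0.1, 2.5, -1.0]))
```

In a full implementation the alphas are trained jointly with the network weights in a bilevel scheme (weights on training data, alphas on validation data); the sketch above only shows why the relaxation makes architecture choice amenable to gradients.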
Papers
OStr-DARTS: Differentiable Neural Architecture Search based on Operation Strength
Le Yang, Ziwei Zheng, Yizeng Han, Shiji Song, Gao Huang, Fan Li
EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition
Huafeng Qin, Hongyu Zhu, Xin Jin, Xin Yu, Mounim A. El-Yacoubi, Xinbo Gao