Differentiable Architecture Search
Differentiable architecture search (DARTS) automates neural network design by relaxing the discrete space of candidate architectures into a continuous one, so that architectures can be found with gradient-based optimization. Current research focuses on improving DARTS' stability, efficiency, and robustness, addressing issues such as performance collapse and the selection of suitable operations, often through novel regularization techniques, adaptive learning rate scheduling, and the incorporation of sparse operations or generative models. This automated design process can significantly accelerate the development of high-performing neural networks across applications, reducing reliance on manual design and expert knowledge.
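The core idea behind the continuous relaxation can be illustrated with a minimal sketch: each edge in the network computes a softmax-weighted mixture of candidate operations, where the weights (the architecture parameters, usually called alpha) are learned by gradient descent; after search, the discrete architecture keeps the highest-weighted operation. The candidate operations and parameter values below are hypothetical placeholders, not taken from any specific paper or implementation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over architecture parameters."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical candidate operations on a single edge of the search cell.
OPS = [
    lambda x: x,                 # identity (skip connection)
    lambda x: np.zeros_like(x),  # "zero" operation (no connection)
    lambda x: np.maximum(x, 0),  # ReLU, standing in for a conv op
]

def mixed_op(x, alpha):
    """Continuous relaxation: softmax-weighted sum of all candidate ops.

    Because the mixture is differentiable in alpha, the architecture
    parameters can be optimized jointly with the network weights.
    """
    w = softmax(alpha)
    return sum(w_i * op(x) for w_i, op in zip(w, OPS))

# Toy architecture parameters (would normally be learned by gradient descent).
alpha = np.array([0.1, -2.0, 1.5])
x = np.array([-1.0, 2.0])

y = mixed_op(x, alpha)          # soft output used during search
best_op = int(np.argmax(alpha)) # discretization step: keep the top operation
```

In the full method, alpha is updated on a validation set while the ordinary weights are updated on the training set, a bilevel optimization that this one-edge sketch omits.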