Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures, aiming to replace the time-consuming and often suboptimal process of manual design. Current research focuses on improving search efficiency, exploring different search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing effective zero-cost proxies that reduce the computational cost of evaluating candidate architectures. The field is significant because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained settings such as microcontrollers and in-memory computing.
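To make the search-and-proxy idea concrete, below is a minimal sketch of NAS as random search over a toy search space, scored with a simple gradient-norm zero-cost proxy instead of full training. The search space, the `build_model` and `grad_norm_proxy` helpers, and the budget of 20 candidates are illustrative assumptions, not taken from any specific NAS framework or paper.

```python
# Minimal NAS sketch: random search + a zero-cost proxy (illustrative only).
import random
import torch
import torch.nn as nn

# Hypothetical toy search space over MLP hyperparameters.
SEARCH_SPACE = {
    "depth": [2, 3, 4],            # number of hidden layers
    "width": [32, 64, 128],        # units per hidden layer
    "activation": ["relu", "tanh"],
}

def sample_architecture():
    """Draw one candidate configuration uniformly from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(cfg, in_dim=16, out_dim=10):
    """Instantiate an MLP from a sampled configuration."""
    act = nn.ReLU if cfg["activation"] == "relu" else nn.Tanh
    layers, dim = [], in_dim
    for _ in range(cfg["depth"]):
        layers += [nn.Linear(dim, cfg["width"]), act()]
        dim = cfg["width"]
    layers.append(nn.Linear(dim, out_dim))
    return nn.Sequential(*layers)

def grad_norm_proxy(model, batch_size=8, in_dim=16, num_classes=10):
    """Zero-cost proxy: gradient norm on one random batch, no training."""
    x = torch.randn(batch_size, in_dim)
    y = torch.randint(0, num_classes, (batch_size,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    return sum(p.grad.norm().item() for p in model.parameters() if p.grad is not None)

best_cfg, best_score = None, float("-inf")
for _ in range(20):                      # small illustrative search budget
    cfg = sample_architecture()
    score = grad_norm_proxy(build_model(cfg))
    if score > best_score:
        best_cfg, best_score = cfg, score
print("Best architecture:", best_cfg, "proxy score:", best_score)
```

In practice the random sampler would be replaced by a reinforcement-learning controller, an evolutionary algorithm, or a differentiable relaxation of the search space, and the proxy would be validated against (or combined with) actual training accuracy; the structure of the loop, however, stays the same.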
Papers
Recent papers in this area were published between July 5 and September 5, 2024.