Neural Network Configuration
Neural network configuration focuses on selecting the architecture and hyperparameters of a neural network so that it performs well on a specific task. Current research emphasizes efficient exploration of design spaces, including investigations into architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), and temporal graph neural networks (TGNs), and into how hyperparameters such as learning rate, batch size, and loss function affect model accuracy and efficiency. These efforts are crucial for improving the performance and energy efficiency of neural networks across diverse applications, from medical image analysis to time-series forecasting.
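As a concrete illustration of exploring such a design space, the sketch below runs a basic random search over a small, hypothetical configuration space covering architecture type, learning rate, batch size, and loss function. The search space, ranges, and the placeholder evaluate function are illustrative assumptions, not values or methods taken from the papers summarized here.

```python
import math
import random

# Hypothetical search space; the ranges and choices are illustrative
# assumptions, not values drawn from any specific paper.
SEARCH_SPACE = {
    "architecture": ["cnn", "rnn", "tgn"],
    "learning_rate": (1e-5, 1e-1),      # sampled log-uniformly
    "batch_size": [16, 32, 64, 128],
    "loss_function": ["cross_entropy", "mse"],
}


def sample_config(space):
    """Draw one random configuration from the search space."""
    lo, hi = space["learning_rate"]
    return {
        "architecture": random.choice(space["architecture"]),
        "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
        "batch_size": random.choice(space["batch_size"]),
        "loss_function": random.choice(space["loss_function"]),
    }


def evaluate(config):
    """Placeholder: in practice this would train a model with `config`
    and return validation accuracy; here a dummy score stands in."""
    return random.random()


def random_search(space, n_trials=20):
    """Basic random search: sample configurations, keep the best-scoring one."""
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = sample_config(space)
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score


if __name__ == "__main__":
    config, score = random_search(SEARCH_SPACE)
    print(f"Best configuration: {config} (score={score:.3f})")
```

Random search is only one baseline strategy; the papers in this area typically study more sample-efficient alternatives such as Bayesian optimization or neural architecture search, which replace the uniform sampling step with a model of how configurations map to performance.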