Sparse Network Training
Sparse network training aims to develop efficient deep learning models by reducing the number of parameters while maintaining or improving performance. Current research focuses on methods that encourage sparsity during training, including techniques that leverage inter-layer feature similarity and those that generate and combine multiple sparse subnetworks. This area is significant because it promises to reduce computational costs, energy consumption, and memory requirements for deep learning, making it more accessible and environmentally friendly while potentially improving model generalization.
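The core mechanism behind most of these methods is a binary mask over the weights: small-magnitude weights are zeroed out, and the mask is reapplied so the network stays sparse as training continues. A minimal sketch of one-shot magnitude pruning, assuming NumPy; `magnitude_prune` is a hypothetical helper for illustration, not taken from any particular paper:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Hypothetical helper: zero out the smallest-magnitude
    fraction `sparsity` of entries, returning the pruned
    weights and the boolean keep-mask."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned, mask = magnitude_prune(w, sparsity=0.9)
print(f"kept fraction: {mask.mean():.2f}")
```

In an iterative-pruning loop, the mask would be recomputed periodically and multiplied into the weights (and gradients) after each update, so pruned connections stay at zero for the rest of training.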