Decay Regularization
Decay regularization is a family of techniques for improving the generalization and stability of neural networks, and it remains an active research area aimed at making models more efficient and robust. Studies examine its effect across architectures, including convolutional neural networks, recurrent neural networks, and transformers, often asking how different forms of decay regularization (e.g., weight decay, norm control) shape training dynamics and the properties of the resulting models. This line of work matters because it addresses two practical challenges: deploying large neural networks on resource-constrained devices, and making model performance reliable across diverse datasets and tasks.
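To make the most common form concrete, the sketch below shows decoupled weight decay in plain gradient descent on a least-squares problem. It is a minimal illustration, not drawn from any particular paper: the model, data, and hyperparameter values (lr, lam) are illustrative placeholders.

```python
import numpy as np

# Minimal sketch of weight decay in plain gradient descent.
# Model: least-squares regression; hyperparameters are illustrative.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(64, 10)), rng.normal(size=64)
w = np.zeros(10)
lr, lam = 1e-2, 1e-3  # learning rate and decay coefficient

for _ in range(200):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of the data loss
    # Decoupled weight decay: shrink the weights toward zero each step,
    # separately from the loss gradient. For vanilla SGD this update is
    # equivalent to adding an L2 penalty 0.5 * lam * ||w||^2 to the loss.
    w = (1 - lr * lam) * w - lr * grad

print("weight norm after training:", np.linalg.norm(w))
```

For adaptive optimizers such as Adam, the penalty-in-the-loss and decoupled forms are no longer equivalent, which is one reason the decoupled variant (as in AdamW) is studied as a distinct method.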