Batch Normalization Layer
Batch normalization (BN) layers are a core component of deep neural networks, used to stabilize training and improve model performance by normalizing the activations of each layer. Current research focuses on leveraging BN layers in domain adaptation and continual learning scenarios, often by modifying or freezing BN statistics to mitigate catastrophic forgetting or to improve robustness to distribution shift. This work is significant because careful management of BN statistics is critical for efficient and accurate model training across diverse applications, including object detection, acoustic scene classification, and medical image analysis.
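As a concrete illustration, the sketch below (PyTorch, with hypothetical helper names batchnorm_forward and freeze_bn_stats) shows the basic BN computation and one common way of freezing BN statistics during adaptation, under the assumption that the rest of the model continues to train normally. It is a minimal sketch, not a reproduction of any specific paper's method.

import torch
import torch.nn as nn

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch dimension, then scale and shift.
    # (Running-statistics updates, used at inference time, are omitted here.)
    mean = x.mean(dim=0)
    var = x.var(dim=0, unbiased=False)
    x_hat = (x - mean) / torch.sqrt(var + eps)
    return gamma * x_hat + beta

def freeze_bn_stats(model: nn.Module) -> None:
    # Put every BN layer into eval mode so it uses its stored running
    # mean/variance instead of batch statistics, and stop gradient updates
    # to its affine parameters. Freezing BN this way is one common tactic
    # for mitigating forgetting under distribution shift.
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.eval()
            if m.affine:
                m.weight.requires_grad_(False)
                m.bias.requires_grad_(False)

Note that a later call to model.train() puts BN modules back into training mode, so a helper like freeze_bn_stats must be re-applied after each mode toggle.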