Batch Normalization Layer
Batch normalization (BN) layers are a standard component of deep neural networks: they stabilize training and improve model performance by normalizing each layer's activations over the current mini-batch. Current research leverages BN layers for adaptation under domain shift and in continual learning, often by modifying or freezing BN statistics to mitigate catastrophic forgetting or to improve robustness to distribution shifts. This work is significant because effective management of BN layers is critical for efficient and accurate model training across diverse applications, including object detection, acoustic scene classification, and medical image analysis.
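Concretely, for each feature BN computes y = gamma * (x - mu) / sqrt(var + eps) + beta, where mu and var are the mean and variance of the current mini-batch and gamma, beta are learnable parameters. A minimal NumPy sketch of this forward pass (the function name batch_norm_forward is illustrative, not from any specific paper):

    import numpy as np

    def batch_norm_forward(x, gamma, beta, eps=1e-5):
        """Normalize a batch of activations x (shape [N, D]) per feature,
        then apply the learnable scale (gamma) and shift (beta), both shape [D]."""
        mu = x.mean(axis=0)                    # per-feature batch mean
        var = x.var(axis=0)                    # per-feature batch variance
        x_hat = (x - mu) / np.sqrt(var + eps)  # zero-mean, unit-variance activations
        return gamma * x_hat + beta            # learnable affine transform

For example, batch_norm_forward(np.random.randn(32, 64), np.ones(64), np.zeros(64)) returns activations whose per-feature mean is approximately zero and variance approximately one. At inference time, BN instead uses running estimates of mu and var accumulated during training.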
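As a sketch of the freezing strategy mentioned above, the following PyTorch snippet stops every BN layer from updating its running statistics, so adaptation on a new domain cannot overwrite the statistics learned on the source data. This is one common approach, not the method of any particular paper listed here, and the helper name freeze_bn_stats is hypothetical:

    import torch.nn as nn

    def freeze_bn_stats(model: nn.Module) -> None:
        """Put every BN layer in eval mode so its running mean/variance
        stop updating, preserving the source-domain statistics."""
        for m in model.modules():
            if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
                m.eval()          # forward passes now use frozen running statistics
                if m.affine:      # optionally freeze the learnable scale/shift too
                    m.weight.requires_grad_(False)
                    m.bias.requires_grad_(False)

Note that a later call to model.train() flips BN layers back to training mode, so this helper would need to be re-applied after each mode switch.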