Batch Normalization

Batch normalization (BN) is a widely used technique in deep neural networks that normalizes each layer's activations to zero mean and unit variance over a mini-batch, then rescales them with learned parameters, improving training stability and generalization. Current research focuses on addressing BN's limitations in diverse settings, such as federated learning (where data is distributed across multiple devices and per-client batch statistics may diverge) and test-time adaptation (where models must adapt to unseen data distributions), often employing techniques like adaptive BN, variance reduction algorithms, and alternative normalization methods (e.g., group normalization, instance normalization). Overcoming these limitations is crucial for improving the robustness and efficiency of deep learning models across various applications, including medical image analysis and speaker recognition.
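
To make the normalize-then-rescale step and the train/test asymmetry concrete, here is a minimal NumPy sketch of a standard BN forward pass; the function name `batch_norm_forward` and its parameters are illustrative, not drawn from any particular paper or library.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, running_mean, running_var,
                       training=True, momentum=0.1, eps=1e-5):
    """Batch-normalize activations x of shape (N, D) per feature.

    During training, statistics come from the current mini-batch and the
    running estimates are updated; at inference, the running estimates
    are used instead.
    """
    if training:
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        # Exponential moving average of batch statistics for test time.
        running_mean = (1 - momentum) * running_mean + momentum * mean
        running_var = (1 - momentum) * running_var + momentum * var
    else:
        mean, var = running_mean, running_var

    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    out = gamma * x_hat + beta              # learned scale and shift
    return out, running_mean, running_var
```

This train/test split is exactly where many of the limitations above arise: adaptive BN and test-time adaptation methods typically recompute or interpolate the running statistics on batches from the target distribution, rather than reusing the training-time estimates unchanged.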

Papers