BatchNorm Minus Implementation

"BatchNorm minus" research explores modifications and alternatives to standard Batch Normalization (BatchNorm) in deep neural networks, aiming to improve efficiency, robustness, and training stability. Current efforts focus on re-parameterizing BatchNorm for efficient inference in Transformers and CNNs, adapting its statistics for improved robustness to noisy data, and leveraging its properties for structured pruning and neural architecture search. These investigations are significant because they address computational bottlenecks in deploying large models and enhance the resilience and performance of neural networks across diverse hardware and data conditions.

Papers