BatchNorm Minus Implementation
"BatchNorm minus" research explores modifications and alternatives to standard Batch Normalization (BatchNorm) in deep neural networks, aiming to improve efficiency, robustness, and training stability. Current efforts focus on re-parameterizing BatchNorm for efficient inference in Transformers and CNNs, adapting its statistics for improved robustness to noisy data, and leveraging its properties for structured pruning and neural architecture search. These investigations are significant because they address computational bottlenecks in deploying large models and enhance the resilience and performance of neural networks across diverse hardware and data conditions.
Papers
May 19, 2024
October 31, 2023
June 8, 2023
May 28, 2023
March 22, 2023
April 13, 2022
December 1, 2021