Normalizer-Free
Normalizer-free networks aim to eliminate normalization layers such as batch normalization (BN), which add computational and memory overhead, introduce a discrepancy between training and inference behavior, and can complicate interpretability and, in some regimes, generalization. Current research focuses on alternative architectures and training techniques that match or exceed normalized baselines without any normalization layer, including modifications to residual blocks that control signal propagation and novel components such as "NoMorelization." This work is significant because it addresses inherent limitations of normalization layers, potentially yielding faster, more efficient, and more robust deep learning models across a range of applications.
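To illustrate the kind of residual-block modification involved, the sketch below shows a normalizer-free block in the style popularized by NF-ResNets (Brock et al.), written in PyTorch. The alpha/beta scaling constants, the WSConv2d helper, and the exact standardization formula here are illustrative assumptions rather than any specific paper's recipe: the residual branch input is downscaled by an expected standard deviation (beta), the branch output is damped by a small alpha so variance grows slowly as blocks stack, and weight standardization keeps activation statistics well behaved without a BN layer.

```python
# Minimal sketch of a normalizer-free residual block (assumed PyTorch setting;
# constants and layer choices are illustrative, not a published recipe).
import torch
import torch.nn as nn
import torch.nn.functional as F


class WSConv2d(nn.Conv2d):
    """Convolution with (simplified) Scaled Weight Standardization.

    Standardizing the weights per output channel controls activation
    variance, substituting for the normalization BN would provide.
    """

    def forward(self, x):
        w = self.weight
        mean = w.mean(dim=(1, 2, 3), keepdim=True)
        var = w.var(dim=(1, 2, 3), keepdim=True)
        fan_in = w[0].numel()  # in_channels * kernel_h * kernel_w
        w = (w - mean) * torch.rsqrt(var * fan_in + 1e-4)
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


class NFResidualBlock(nn.Module):
    """Normalizer-free residual block: y = x + alpha * f(x / beta)."""

    def __init__(self, channels, alpha=0.2, beta=1.0):
        super().__init__()
        self.alpha, self.beta = alpha, beta
        self.conv1 = WSConv2d(channels, channels, 3, padding=1)
        self.conv2 = WSConv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        # Downscale so the branch sees roughly unit-variance input,
        # then damp the branch output before the skip addition.
        out = F.relu(self.conv1(x / self.beta))
        out = self.conv2(out)
        return x + self.alpha * out


x = torch.randn(8, 64, 32, 32)
block = NFResidualBlock(64)
print(block(x).shape)  # torch.Size([8, 64, 32, 32])
```

In practice, beta is typically updated per block to track the analytically predicted variance of its input, and alpha is kept small (e.g., 0.2) so that stacked blocks remain trainable at depth.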
Papers
Recent papers on this topic were published on March 6, 2024; October 13, 2022; July 4, 2022; March 21, 2022; and December 23, 2021.