Dropout Layer
Dropout layers are a regularization technique for neural networks that randomly deactivate (zero out) a subset of neurons during training, which discourages units from co-adapting and thereby improves generalization and helps prevent overfitting. Current research explores variants such as layer-wise and channel-wise dropout, integrates dropout into diverse model architectures (e.g., transformers and diffusion models), and employs it for uncertainty quantification and for robust training in resource-constrained environments. These advances enhance model performance, robustness, and efficiency across applications including natural language processing, image classification, and multi-sensor fusion.
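As a concrete illustration of the core mechanism, below is a minimal NumPy sketch of the standard "inverted dropout" formulation: during training, each activation is zeroed with probability p and the survivors are rescaled by 1/(1-p) so the expected activation is unchanged, while at inference the input passes through untouched. The function name and signature are illustrative, not drawn from any specific paper in this collection.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and rescale the survivors by 1/(1-p); identity at inference."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Example: apply dropout to a batch of hidden activations.
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 8))
h_train = dropout(h, p=0.5, training=True, rng=rng)  # roughly half the units zeroed
h_eval = dropout(h, p=0.5, training=False)           # unchanged at inference
```

The rescaling during training (rather than downscaling at inference) is the convention used by most modern frameworks, since it leaves the inference path free of any dropout-specific computation.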