Dropout Mask
A dropout mask is the binary (or stochastic) pattern that decides which units a neural network zeroes out during training; it regularizes the model and improves generalization. Current research aims to optimize how these masks are generated and applied. Recent work explores learning dropout masks with generative models such as GFlowNets, adapting masks dynamically with particle filters, and selecting masks based on gradient signal-to-noise ratios to improve robustness and domain generalization. These advances improve performance across diverse applications, including image classification (e.g., in medical imaging), natural language processing, and robotics, by addressing overfitting, uncertainty estimation, and adaptation to changing environments.
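As a concrete illustration of the basic mechanism these methods build on, here is a minimal sketch of standard inverted dropout in NumPy: a Bernoulli mask zeroes each unit with probability `p`, and survivors are rescaled by `1/(1-p)` so the expected activation is unchanged. The function name and signature are illustrative, not taken from any of the cited papers.

```python
import numpy as np

def apply_dropout(x, p=0.5, rng=None, training=True):
    """Apply an inverted-dropout mask to activations x.

    Each unit is zeroed independently with probability p; the surviving
    units are scaled by 1/(1-p) so E[output] == input. At inference time
    (training=False) the input is returned unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Example: with p=0.5, surviving activations of 1.0 become 2.0.
x = np.ones((2, 4))
out = apply_dropout(x, p=0.5, rng=np.random.default_rng(0))
```

The research directions summarized above replace the independent Bernoulli sampling in `mask` with learned or adaptive distributions (e.g., a GFlowNet policy or a particle-filter update) while keeping this apply-and-rescale structure.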