Consistency Regularization
Consistency regularization is a machine learning technique that improves model generalization and robustness by enforcing consistent predictions across differently augmented versions of the same input. Current research applies it to tasks such as image segmentation, speech recognition, and natural language processing, often integrating it with architectures like convolutional neural networks (CNNs), transformers, and connectionist temporal classification (CTC). Its widespread adoption stems from its effectiveness at mitigating overfitting, particularly when labeled data is scarce or domain shift is significant, leading to improved performance and reliability across diverse applications.
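The core idea can be sketched with a minimal NumPy example: a supervised loss on labeled data is combined with an unsupervised consistency term that penalizes disagreement between predictions on two augmented views of the same input. The augmentation (Gaussian noise), the toy linear classifier, and the weighting parameter `lam` are illustrative assumptions, not a specific method from the papers listed below.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, noise_scale=0.1):
    # Gaussian input noise stands in for a real data augmentation
    return x + rng.normal(scale=noise_scale, size=x.shape)

def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict(x, W):
    # Toy linear classifier standing in for a CNN or transformer
    return softmax(x @ W)

def consistency_loss(x, W):
    # Mean squared difference between predictions on two augmented views;
    # note this term needs no labels, so it can use unlabeled data
    p1 = predict(augment(x), W)
    p2 = predict(augment(x), W)
    return np.mean((p1 - p2) ** 2)

def total_loss(x, y_onehot, W, lam=1.0):
    # Supervised cross-entropy plus the consistency term, weighted by lam
    p = predict(x, W)
    ce = -np.mean(np.sum(y_onehot * np.log(p + 1e-12), axis=-1))
    return ce + lam * consistency_loss(x, W)

x = rng.normal(size=(8, 4))                 # batch of 8 inputs, 4 features
W = rng.normal(size=(4, 3))                 # weights for 3 classes
y = np.eye(3)[rng.integers(0, 3, size=8)]   # random one-hot labels
loss = total_loss(x, y, W)
```

In a training loop, the gradient of `total_loss` would update `W`, so the consistency term pushes the model toward predictions that are stable under augmentation; semi-supervised methods typically compute it on a large pool of unlabeled inputs.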
Papers
Judge Like a Real Doctor: Dual Teacher Sample Consistency Framework for Semi-supervised Medical Image Classification
Zhang Qixiang, Yang Yuxiang, Zu Chen, Zhang Jianjia, Wu Xi, Zhou Jiliu, Wang Yan
SPACE: SPAtial-aware Consistency rEgularization for anomaly detection in Industrial applications
Daehwan Kim, Hyungmin Kim, Daun Jeong, Sungho Suh, Hansang Cho