Consistency Regularization
Consistency regularization is a machine learning technique that improves model generalization and robustness by enforcing consistent predictions across different augmented versions of the same input. Current research applies it to tasks such as image segmentation, speech recognition, and natural language processing, often in combination with architectures like convolutional neural networks (CNNs) and transformers, or training objectives like connectionist temporal classification (CTC). Its widespread adoption stems from its effectiveness in mitigating overfitting, particularly when labeled data is limited or domain shift is significant, leading to improved performance and reliability across diverse applications.
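The core idea can be illustrated with a minimal NumPy sketch: predictions on two independently augmented views of the same inputs are pushed to agree via an extra loss term, added to the usual supervised loss. The toy linear classifier, Gaussian-noise augmentation, and names such as `consistency_loss` below are illustrative assumptions, not drawn from any specific paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def augment(x, noise_scale=0.1):
    # Toy "augmentation": additive Gaussian noise. In practice this
    # would be crops/flips for images, SpecAugment for speech, etc.
    return x + rng.normal(scale=noise_scale, size=x.shape)

def predict(x, W):
    # Toy linear classifier standing in for a real network.
    return softmax(x @ W)

def consistency_loss(x, W):
    # Penalize disagreement between predictions on two augmented views
    # of the same (possibly unlabeled) inputs.
    p1 = predict(augment(x), W)
    p2 = predict(augment(x), W)
    return float(np.mean((p1 - p2) ** 2))

def total_loss(x, y, W, lam=1.0):
    # Supervised cross-entropy on labeled data plus the unsupervised
    # consistency term, weighted by lam.
    p = predict(x, W)
    ce = -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))
    return ce + lam * consistency_loss(x, W)

X = rng.normal(size=(8, 4))
y = rng.integers(0, 3, size=8)
W = rng.normal(size=(4, 3))
print(total_loss(X, y, W))
```

Because the consistency term needs no labels, it can be computed on unlabeled data as well, which is why the technique is especially useful in semi-supervised settings.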
Papers
When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting
Harshavardhan Kamarthi, Lingkai Kong, Alexander Rodríguez, Chao Zhang, B. Aditya Prakash
Domain Generalization via Selective Consistency Regularization for Time Series Classification
Wenyu Zhang, Mohamed Ragab, Chuan-Sheng Foo