Mixup Regularization

Mixup regularization is a data augmentation technique that improves the generalization of deep learning models by creating synthetic training examples as convex combinations of pairs of data points and their labels. Current research focuses on refining mixup methods: exploring variants such as UMAP Mixup for manifold-based augmentation and Semantic Equivariant Mixup for leveraging richer semantic information, and integrating mixup with complementary techniques such as Sharpness-Aware Minimization to reach flatter minima and improve robustness. These advances enhance model performance, particularly under limited data or domain shift, with applications ranging from image classification and object recognition to domain adaptation and few-shot learning.
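The core operation described above can be sketched in a few lines: draw a mixing coefficient λ from a Beta(α, α) distribution and form convex combinations of two inputs and their (one-hot) labels. This is a minimal NumPy illustration, not a full training pipeline; the function name and the choice α = 0.2 are illustrative.

```python
import numpy as np

def mixup_pair(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Return one mixup example from two samples.

    x1, x2: input arrays of the same shape.
    y1, y2: one-hot (or probability) label vectors.
    alpha:  Beta-distribution concentration; small values keep
            lam near 0 or 1, producing mild interpolation.
    """
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # mixing coefficient in [0, 1]
    x = lam * x1 + (1.0 - lam) * x2       # convex combination of inputs
    y = lam * y1 + (1.0 - lam) * y2       # same combination of labels
    return x, y, lam
```

Because the same λ mixes both inputs and labels, the resulting soft targets keep the label vector a valid probability distribution, which is what lets mixup act as a label-smoothing-like regularizer.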

Papers