Mixup Loss
Mixup is a data augmentation technique that improves the robustness and generalization of deep neural networks by creating synthetic training examples as convex combinations of pairs of inputs and their corresponding labels; the mixup loss trains the network on these interpolated targets. Current research focuses on understanding the mechanisms behind mixup's success, particularly its influence on the geometric configuration of learned representations, and on its application across learning paradigms including multimodal learning, speech enhancement, and unsupervised domain adaptation. The technique has been shown to improve model calibration and performance across diverse tasks, from image classification to speech processing.
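The interpolation described above can be sketched in a few lines of NumPy. This is a minimal illustration of the standard mixup recipe, not any specific paper's implementation: each example is paired with a randomly permuted partner and blended with a coefficient drawn from a Beta(alpha, alpha) distribution; labels are assumed to be one-hot so they can be mixed the same way.

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=None):
    """Return a mixed batch (x_mixed, y_mixed) and the mixing coefficient.

    x: array of inputs, shape (batch, ...)
    y: one-hot labels, shape (batch, num_classes)
    alpha: Beta-distribution parameter controlling interpolation strength.
    """
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)        # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))      # random partner for each example
    x_mixed = lam * x + (1 - lam) * x[perm]
    y_mixed = lam * y + (1 - lam) * y[perm]
    return x_mixed, y_mixed, lam
```

In practice the mixed labels are used directly in the cross-entropy loss; equivalently, one can compute `lam * CE(pred, y) + (1 - lam) * CE(pred, y[perm])`, which avoids materializing soft labels.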