Balanced Loss

Balanced loss functions aim to mitigate the negative effects of imbalanced datasets in machine learning, improving model performance and fairness across classes or groups. Current research focuses on adapting these functions to various model architectures, including gradient boosting decision trees, mixture-of-experts models, and diffusion models, often incorporating techniques such as dynamic weighting, entropy-based sampling, and curriculum learning to achieve better class balance. This work matters because class distributions in real-world data are rarely uniform, and correcting for that imbalance yields more robust and equitable models.
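To make the weighting idea concrete, here is a minimal sketch of one common balancing scheme: inverse-frequency class weights applied to cross-entropy. The function names (`class_weights`, `weighted_cross_entropy`) and the toy label distribution are illustrative assumptions, not drawn from any specific paper above; dynamic or effective-number weighting schemes replace the simple inverse-frequency formula used here.

```python
import math
from collections import Counter

def class_weights(labels):
    """Inverse-frequency weights: w_c = N / (K * n_c), so the
    weighted class counts are equal across all K classes."""
    counts = Counter(labels)
    total = len(labels)
    k = len(counts)
    return {c: total / (k * n) for c, n in counts.items()}

def weighted_cross_entropy(probs, label, weights):
    """Per-example weighted CE: -w_y * log p_y. Rare classes get a
    larger weight, so their gradient contribution is amplified."""
    return -weights[label] * math.log(probs[label])

# Imbalanced toy dataset: class 0 is 9x more frequent than class 1.
labels = [0] * 90 + [1] * 10
w = class_weights(labels)

# A misclassified rare example is penalized far more heavily than a
# misclassified common example with the same predicted probability.
loss_rare = weighted_cross_entropy({0: 0.9, 1: 0.1}, 1, w)
loss_common = weighted_cross_entropy({0: 0.1, 1: 0.9}, 0, w)
```

Here `w[1]` is ten times `w[0]`, so the rare class's loss terms are scaled up until both classes contribute equally in aggregate, which is the basic mechanism the dynamic-weighting variants refine.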

Papers