Semi-Supervised Learning
Semi-supervised learning (SSL) aims to improve machine learning model accuracy by leveraging a small amount of labeled data together with abundant unlabeled data. Current research focuses on refining pseudo-labeling techniques to reduce noise and bias in the generated pseudo-labels, employing teacher-student models and contrastive learning, and developing algorithms that make effective use of all available unlabeled samples, including those drawn from open sets or with imbalanced class distributions. These advances matter because they reduce reliance on expensive, time-consuming manual labeling, extending machine learning to domains where annotated data is scarce.
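As a concrete illustration of the pseudo-labeling idea described above, the sketch below shows a confidence-thresholded training step in the style of FixMatch: labeled data contributes a standard supervised loss, while unlabeled data only contributes when the model's prediction on a weakly augmented view is confident enough. This is a minimal sketch, not the method of any specific paper listed here; the function name, argument names, and the threshold value `tau` are assumptions chosen for demonstration.

```python
# Minimal sketch of confidence-thresholded pseudo-labeling (FixMatch-style).
# All names (ssl_training_step, tau, lambda_u, the augmented views) are
# illustrative assumptions, not taken from the summary above.
import torch
import torch.nn.functional as F


def ssl_training_step(model, x_labeled, y_labeled,
                      x_unlabeled_weak, x_unlabeled_strong,
                      tau=0.95, lambda_u=1.0):
    """One SSL step: supervised loss on labeled data plus a pseudo-label
    loss on unlabeled samples whose predicted confidence exceeds tau."""
    # Supervised cross-entropy on the small labeled batch.
    logits_l = model(x_labeled)
    loss_sup = F.cross_entropy(logits_l, y_labeled)

    # Pseudo-labels come from weakly augmented unlabeled data, without gradients.
    with torch.no_grad():
        probs_u = F.softmax(model(x_unlabeled_weak), dim=-1)
        confidence, pseudo_labels = probs_u.max(dim=-1)
        mask = (confidence >= tau).float()  # keep only confident predictions

    # Consistency loss: strongly augmented views must match the pseudo-labels.
    logits_u = model(x_unlabeled_strong)
    loss_unsup = (F.cross_entropy(logits_u, pseudo_labels, reduction="none") * mask).mean()

    return loss_sup + lambda_u * loss_unsup
```

The confidence threshold is the simplest form of the noise-reduction strategy mentioned above: low-confidence pseudo-labels are masked out so they cannot propagate errors, at the cost of discarding some unlabeled samples early in training.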