Self-Supervised Learning Framework
Self-supervised learning (SSL) frameworks aim to train machine learning models using unlabeled data, overcoming the limitations of relying solely on expensive and time-consuming labeled datasets. Current research focuses on adapting various architectures, including autoencoders, transformers, and Siamese networks, to diverse data modalities such as images, videos, point clouds, and time series, often employing contrastive learning or masked prediction strategies. This approach is significantly impacting fields like computer vision, anomaly detection, and recommendation systems by enabling the development of robust models with improved performance, particularly in low-resource settings where labeled data is scarce.
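To make the contrastive-learning strategy mentioned above concrete, here is a minimal sketch of the NT-Xent loss used by SimCLR-style frameworks, written with NumPy for illustration. The function name and the choice of temperature are this sketch's own; real SSL pipelines compute this loss over learned encoder outputs, not raw arrays.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss.

    z1, z2: (N, D) embeddings of two augmented views of the same N samples.
    Each pair (z1[i], z2[i]) is a positive; every other row is a negative.
    """
    N = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)              # (2N, D) stacked views
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize rows
    sim = z @ z.T / temperature                       # pairwise cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    # the positive for row i is the other view of the same sample: row (i+N) mod 2N
    pos = np.concatenate([np.arange(N, 2 * N), np.arange(N)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * N), pos].mean()
```

The loss pulls the two views of each sample together while pushing apart all other samples in the batch; matched view pairs therefore yield a lower loss than mismatched ones.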