Self-Supervised Learning Framework
Self-supervised learning (SSL) frameworks train machine learning models on unlabeled data, reducing reliance on expensive, time-consuming labeled datasets. Current research adapts a range of architectures, including autoencoders, transformers, and Siamese networks, to diverse data modalities such as images, videos, point clouds, and time series, typically using contrastive learning or masked-prediction objectives. These methods are advancing fields like computer vision, anomaly detection, and recommendation systems by producing robust models with improved performance, particularly in low-resource settings where labeled data is scarce.
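As a concrete illustration of the contrastive objective mentioned above, the following is a minimal NumPy sketch of an NT-Xent (normalized temperature-scaled cross-entropy) loss in the style popularized by SimCLR. The function name, batch layout, and temperature value are illustrative assumptions, not taken from any specific paper in this collection.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive (NT-Xent) loss over two augmented views of a batch.

    z1, z2: (N, D) embeddings of the same N samples under two augmentations.
    Matching rows are positive pairs; all other rows in the batch act as
    negatives. Illustrative sketch, not a reference implementation.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)              # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize rows
    sim = z @ z.T / temperature                       # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs

    # The positive for row i is row (i + N) mod 2N, i.e. its other view.
    pos = np.roll(np.arange(2 * n), n)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Intuitively, the loss is a softmax cross-entropy that asks each embedding to identify its own second view among all other embeddings in the batch; masked-prediction objectives instead reconstruct hidden portions of the input (e.g., masked image patches or time-series segments).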