Self Supervised Learning
Self-supervised learning (SSL) aims to train machine learning models using unlabeled data by formulating pretext tasks that encourage the model to learn useful representations. Current research focuses on improving SSL's performance and generalization across diverse data types (images, audio, graphs, point clouds) and downstream tasks, employing techniques like contrastive learning, masked autoencoders, and generative models within various architectures such as transformers and convolutional neural networks. These advancements are significant because they reduce the reliance on expensive and time-consuming data labeling, enabling the development of robust models for applications ranging from medical image analysis and speech recognition to geospatial AI and protein function prediction. The efficiency gains from SSL are also a key focus, with research exploring optimal model and data sizes for given computational budgets.
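The contrastive learning mentioned above can be illustrated with a minimal InfoNCE-style (NT-Xent) loss: two augmented views of the same example form a positive pair, and all other examples in the batch act as negatives. This is a hedged NumPy sketch of the general idea, not the implementation from any of the papers listed below; the function name and temperature value are illustrative.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """Contrastive (InfoNCE / NT-Xent style) loss for paired embeddings.

    z1, z2: arrays of shape (batch, dim). Row i of z1 and row i of z2
    are two augmented views of the same example (the positive pair);
    every other row in the batch serves as a negative.
    """
    # L2-normalise so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature             # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; minimise their negative log-likelihood
    return -np.mean(np.diag(log_softmax))
```

Training with such a loss pulls embeddings of the two views together while pushing apart embeddings of different examples, which is what makes the learned representation transferable to downstream tasks without labels.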
Papers
Re-Simulation-based Self-Supervised Learning for Pre-Training Foundation Models
Philip Harris, Michael Kagan, Jeffrey Krupa, Benedikt Maier, Nathaniel Woodward
Joint-Embedding Masked Autoencoder for Self-supervised Learning of Dynamic Functional Connectivity from the Human Brain
Jungwon Choi, Hyungi Lee, Byung-Hoon Kim, Juho Lee
Augmentations vs Algorithms: What Works in Self-Supervised Learning
Warren Morningstar, Alex Bijamov, Chris Duvarney, Luke Friedman, Neha Kalibhat, Luyang Liu, Philip Mansfield, Renan Rojas-Gomez, Karan Singhal +2
SIRST-5K: Exploring Massive Negatives Synthesis with Self-supervised Learning for Robust Infrared Small Target Detection
Yahao Lu, Yupei Lin, Han Wu, Xiaoyu Xian, Yukai Shi, Liang Lin
Self-Supervised Multiple Instance Learning for Acute Myeloid Leukemia Classification
Salome Kazeminia, Max Joosten, Dragan Bosnacki, Carsten Marr
Self-Supervision in Time for Satellite Images (S3-TSS): A novel method of SSL technique in Satellite images
Akansh Maurya, Hewan Shrestha, Mohammad Munem Shahriar
Reducing self-supervised learning complexity improves weakly-supervised classification performance in computational pathology
Tim Lenz, Omar S. M. El Nahhas, Marta Ligero, Jakob Nikolas Kather
Low-Res Leads the Way: Improving Generalization for Super-Resolution by Self-Supervised Learning
Haoyu Chen, Wenbo Li, Jinjin Gu, Jingjing Ren, Haoze Sun, Xueyi Zou, Zhensong Zhang, Youliang Yan, Lei Zhu
Pooling Image Datasets With Multiple Covariate Shift and Imbalance
Sotirios Panagiotis Chytas, Vishnu Suresh Lokhande, Peiran Li, Vikas Singh
Kick Back & Relax++: Scaling Beyond Ground-Truth Depth with SlowTV & CribsTV
Jaime Spencer, Chris Russell, Simon Hadfield, Richard Bowden
Self-Supervised Representation Learning with Meta Comprehensive Regularization
Huijie Guo, Ying Ba, Jie Hu, Lingyu Si, Wenwen Qiang, Lei Shi
Applying Self-supervised Learning to Network Intrusion Detection for Network Flows with Graph Neural Network
Renjie Xu, Guangwei Wu, Weiping Wang, Xing Gao, An He, Zhengpeng Zhang