Supervised Contrastive Learning
Supervised contrastive learning (SCL) is a representation learning technique that learns robust, discriminative features by pulling similar data points (positives) together and pushing dissimilar ones (negatives) apart in embedding space, using class labels to decide which pairs count as similar. Current research applies SCL to diverse tasks, including image classification, natural language processing, and time-series analysis, often combining it with data augmentation and novel loss functions to address challenges such as imbalanced datasets and label noise. Its effectiveness in improving model performance and generalization across domains makes SCL a significant area of research, with applications ranging from medical image analysis to fraud detection.
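To make the pull-together/push-apart idea concrete, below is a minimal PyTorch sketch of one commonly used supervised contrastive loss formulation, in which all same-label samples in a batch act as positives for an anchor and all other samples act as negatives. The function name, temperature value, and implementation details are illustrative assumptions, not the exact loss used by the papers listed here.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.07):
    """Sketch of a SupCon-style loss: same-label samples are positives.

    features: (N, D) embeddings (L2-normalized below).
    labels:   (N,) integer class labels.
    """
    n = features.size(0)
    device = features.device
    features = F.normalize(features, dim=1)

    # Pairwise cosine similarities scaled by a temperature.
    logits = features @ features.T / temperature

    # Exclude each anchor from its own denominator with a large negative fill.
    self_mask = torch.eye(n, dtype=torch.bool, device=device)
    logits = logits.masked_fill(self_mask, -1e9)

    # Positive pairs share a label and are not the anchor itself.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # Log-probability of each candidate sample given the anchor.
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # Average log-probability over positives; skip anchors with no positive.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (log_prob * pos_mask).sum(dim=1)[valid] / pos_counts[valid]

    # Minimizing this pulls positives together and pushes negatives apart.
    return -mean_log_prob_pos.mean()

# Toy usage: a batch of 8 random embeddings from 3 classes.
if __name__ == "__main__":
    feats = torch.randn(8, 128)
    labs = torch.tensor([0, 1, 2, 0, 1, 2, 0, 1])
    print(supervised_contrastive_loss(feats, labs).item())
```

In practice the embeddings would come from an encoder (often with augmented views of each input added to the batch), and the label comparison is what distinguishes this supervised variant from self-supervised contrastive losses, where only augmentations of the same sample are treated as positives.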
Papers
Exploiting Contrastive Learning and Numerical Evidence for Confusing Legal Judgment Prediction
Leilei Gan, Baokui Li, Kun Kuang, Yating Zhang, Lei Wang, Luu Anh Tuan, Yi Yang, Fei Wu
DeepRGVP: A Novel Microstructure-Informed Supervised Contrastive Learning Framework for Automated Identification Of The Retinogeniculate Pathway Using dMRI Tractography
Sipei Li, Jianzhong He, Tengfei Xue, Guoqiang Xie, Shun Yao, Yuqian Chen, Erickson F. Torio, Yuanjing Feng, Dhiego CA Bastos, Yogesh Rathi, Nikos Makris, Ron Kikinis, Wenya Linda Bi, Alexandra J Golby, Lauren J O'Donnell, Fan Zhang