Contrastive Learning
Contrastive learning is a self-supervised machine learning technique that learns robust data representations by pulling similar data points together and pushing dissimilar ones apart in an embedding space. Current research applies it to diverse modalities, including images, audio, text, and time-series data, often within multimodal frameworks and with architectures such as MoCo and SimCLR, and explores tasks such as object detection, speaker verification, and image dehazing. The approach is significant because it enables effective learning from unlabeled or weakly labeled data, improving model generalization and performance across numerous applications, particularly when annotated data is scarce or domain shifts are large.
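As a concrete illustration of the contrastive objective described above, the sketch below implements an NT-Xent (InfoNCE-style) loss of the kind popularized by SimCLR. It is a minimal PyTorch example, not code from any of the listed papers; the function name, batch size, embedding dimension, and temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over a batch of paired embeddings.

    z1, z2: (N, D) embeddings of two augmented views of the same N inputs.
    Positive pairs are (z1[i], z2[i]); every other sample in the batch
    serves as a negative.
    """
    n = z1.size(0)
    # Concatenate both views and L2-normalize so dot products are cosine similarities.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, D)
    sim = z @ z.t() / temperature                             # (2N, 2N) similarity logits
    sim.fill_diagonal_(float("-inf"))                         # exclude self-similarity

    # For row i, the positive is the other augmented view of the same sample.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage: random embeddings standing in for two augmentations of one batch.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```

In practice z1 and z2 would come from an encoder applied to two random augmentations of each input, and the loss would be minimized over large batches so that each positive pair is contrasted against many negatives.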
Papers
Length is a Curse and a Blessing for Document-level Semantics
Chenghao Xiao, Yizhi Li, G Thomas Hudson, Chenghua Lin, Noura Al Moubayed
Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words
Hiroto Kurita, Goro Kobayashi, Sho Yokoi, Kentaro Inui
Unpaired MRI Super Resolution with Contrastive Learning
Hao Li, Quanwei Liu, Jianan Liu, Xiling Liu, Yanni Dong, Tao Huang, Zhihan Lv
CONTRASTE: Supervised Contrastive Pre-training With Aspect-based Prompts For Aspect Sentiment Triplet Extraction
Rajdeep Mukherjee, Nithish Kannen, Saurabh Kumar Pandey, Pawan Goyal
I$^2$MD: 3D Action Representation Learning with Inter- and Intra-modal Mutual Distillation
Yunyao Mao, Jiajun Deng, Wengang Zhou, Zhenbo Lu, Wanli Ouyang, Houqiang Li
Generative and Contrastive Paradigms Are Complementary for Graph Self-Supervised Learning
Yuxiang Wang, Xiao Yan, Chuang Hu, Fangcheng Fu, Wentao Zhang, Hao Wang, Shuo Shang, Jiawei Jiang
Spectral-Aware Augmentation for Enhanced Graph Representation Learning
Kaiqi Yang, Haoyu Han, Wei Jin, Hui Liu
DistillCSE: Distilled Contrastive Learning for Sentence Embeddings
Jiahao Xu, Wei Shao, Lihui Chen, Lemao Liu
Enhancing drug and cell line representations via contrastive learning for improved anti-cancer drug prioritization
Patrick J. Lawrence, Xia Ning
Multi-level Contrastive Learning for Script-based Character Understanding
Dawei Li, Hengyuan Zhang, Yanran Li, Shiping Yang
Towards Understanding How Transformers Learn In-context Through a Representation Learning Lens
Ruifeng Ren, Yong Liu
Contrastive Learning for Inference in Dialogue
Etsuko Ishii, Yan Xu, Bryan Wilie, Ziwei Ji, Holy Lovenia, Willy Chung, Pascale Fung
WeedCLR: Weed Contrastive Learning through Visual Representations with Class-Optimized Loss in Long-Tailed Datasets
Alzayat Saleh, Alex Olsen, Jake Wood, Bronson Philippa, Mostafa Rahimi Azghadi