Contrastive Method
Contrastive methods are a family of self-supervised learning techniques that aim to learn robust data representations by maximizing the similarity between different augmented views of the same data point while minimizing the similarity between views of different data points. Current research focuses on applying contrastive learning to diverse domains, including graph anomaly detection, domain adaptation, and multimodal learning, often employing architectures such as autoencoders, Siamese networks, and Vision Transformers. These methods are proving valuable for improving model performance in scenarios with limited labeled data, enhancing generalization across domains, and enabling efficient knowledge transfer, with applications ranging from medical image analysis to large language model fine-tuning.
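The core objective described above is often instantiated as the NT-Xent (normalized temperature-scaled cross-entropy) loss popularized by SimCLR. Below is a minimal NumPy sketch of that loss; the function name, the batch shapes, and the temperature value are illustrative assumptions, not taken from any of the papers listed here.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Illustrative NT-Xent contrastive loss.

    z1, z2: (N, D) arrays of embeddings for two augmented views of the
    same N data points; row i of z1 and row i of z2 form a positive pair.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)               # (2N, D) stacked views
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    # The positive for row i is its counterpart in the other view:
    # rows 0..N-1 pair with N..2N-1 and vice versa.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

# Toy check: aligned views should score a lower loss than unrelated ones.
rng = np.random.default_rng(0)
a = rng.normal(size=(8, 16))
low = nt_xent_loss(a, a)                                # positives identical to anchors
high = nt_xent_loss(a, rng.normal(size=(8, 16)))        # unrelated "views"
```

In practice the embeddings would come from an encoder (e.g. a Siamese network or Vision Transformer, as mentioned above) applied to two random augmentations of each input, and the loss would be minimized with a standard optimizer.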
Papers
How to Bridge Spatial and Temporal Heterogeneity in Link Prediction? A Contrastive Method
Yu Tai, Xinglong Wu, Hongwei Yang, Hui He, Duanjing Chen, Yuanming Shao, Weizhe Zhang
A Multi-Granularity Supervised Contrastive Framework for Remaining Useful Life Prediction of Aero-engines
Zixuan He, Ziqian Kong, Zhengyu Chen, Yuling Zhan, Zijun Que, Zhengguo Xu