Contrastive Sentence Embedding
Contrastive sentence embedding focuses on learning high-quality vector representations of sentences by contrasting similar and dissimilar sentence pairs. Current research emphasizes improving the efficiency and effectiveness of these methods, exploring techniques like data augmentation (including leveraging large language models), novel loss functions (e.g., those based on gradient analysis or optimal transport), and knowledge distillation to create smaller, faster models. These advancements are crucial for improving various downstream NLP tasks, such as semantic textual similarity, information retrieval, and text classification, particularly in resource-constrained settings or when dealing with noisy data.
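The core idea — pulling similar pairs together and pushing dissimilar ones apart — is typically implemented with an InfoNCE-style loss over in-batch negatives. The sketch below is a minimal, illustrative NumPy version of such a loss (the function name, temperature value, and use of other in-batch positives as negatives are common conventions, not taken from any specific paper discussed here):

```python
import numpy as np

def contrastive_loss(anchors, positives, temperature=0.05):
    """InfoNCE-style contrastive loss with in-batch negatives.

    anchors, positives: arrays of shape (batch, dim), where row i of
    `positives` is the similar counterpart of row i of `anchors`.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)

    # sim[i, j] = cos(anchor_i, positive_j), scaled by temperature
    sim = (a @ p.T) / temperature

    # For anchor i, positive_i is the "correct class"; every other
    # positive in the batch serves as a negative (cross-entropy over rows).
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(a))
    return -log_probs[idx, idx].mean()
```

When each anchor is paired with its own embedding the loss is near zero; mismatched pairings drive it up, which is exactly the gradient signal that shapes the embedding space.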