Sentence Embeddings
Sentence embeddings represent sentences as dense vectors, aiming to capture their semantic meaning for various natural language processing tasks. Current research focuses on improving embedding quality through techniques like contrastive learning, domain adaptation (especially for low-resource languages), and exploring the internal structure of embeddings to better understand how linguistic information is encoded. These advancements are significant because effective sentence embeddings are crucial for applications ranging from semantic search and text classification to machine translation and recommendation systems.
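To make the core idea concrete, here is a minimal sketch of how a sentence embedding can be built and compared: token vectors are mean-pooled into one dense vector per sentence, and semantic similarity is scored with cosine similarity. The word vectors below are hypothetical toy values chosen for illustration; in practice they would come from a pre-trained model.

```python
import numpy as np

# Toy word vectors (hypothetical values for illustration only;
# a real system would use a pre-trained encoder).
word_vectors = {
    "cats":    np.array([0.90, 0.10, 0.00]),
    "felines": np.array([0.85, 0.15, 0.05]),
    "sit":     np.array([0.10, 0.80, 0.10]),
    "rest":    np.array([0.15, 0.75, 0.20]),
    "on":      np.array([0.00, 0.10, 0.90]),
    "mats":    np.array([0.20, 0.20, 0.70]),
}

def embed(sentence):
    """Mean-pool word vectors into a single dense sentence embedding."""
    vecs = [word_vectors[w] for w in sentence.lower().split()]
    return np.mean(vecs, axis=0)

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

e1 = embed("cats sit on mats")
e2 = embed("felines rest on mats")   # paraphrase of e1
e3 = embed("on mats")                # shares words but not the full meaning

# The paraphrase should score closer to e1 than the fragment does.
print(cosine(e1, e2) > cosine(e1, e3))
```

This is the same retrieval primitive used in semantic search: embed the query and all candidate sentences once, then rank candidates by cosine similarity to the query vector. Modern approaches such as contrastive learning replace the toy pooling above with a trained encoder, but the comparison step is unchanged.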
Papers
Apple of Sodom: Hidden Backdoors in Superior Sentence Embeddings via Contrastive Learning
Xiaoyi Chen, Baisong Xin, Shengfang Zhai, Shiqing Ma, Qingni Shen, Zhonghai Wu
Pre-trained Sentence Embeddings for Implicit Discourse Relation Classification
Murali Raghu Babu Balusu, Yangfeng Ji, Jacob Eisenstein