Sentence Representation
Sentence representation focuses on encoding the meaning of sentences into numerical vectors, enabling computers to process and compare natural language. Current research emphasizes improving the quality of these representations through techniques such as contrastive learning, which trains an encoder to pull semantically similar sentences together and push dissimilar ones apart, and through combining sentence-level and token-level signals for a more nuanced understanding. The field underpins advances in many NLP tasks, including semantic similarity assessment, text summarization, and question answering, and it matters both for the scientific understanding of language and for practical applications.
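To make the contrastive-learning idea concrete, here is a minimal sketch of an in-batch contrastive (InfoNCE-style) objective for sentence embeddings, written in PyTorch. The function and variable names (`contrastive_loss`, the stand-in random embeddings) are illustrative assumptions, not the method of any particular paper: in practice the two views of each sentence would come from an encoder, and each sentence's positive is pushed to score higher than every other sentence in the batch.

```python
# Sketch of an InfoNCE-style contrastive loss over sentence embeddings.
# All names here are illustrative; a real setup would feed encoder outputs.
import torch
import torch.nn.functional as F

def contrastive_loss(anchor_emb: torch.Tensor,
                     positive_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    """Each anchor should be most similar to its own positive; the other
    sentences in the batch act as in-batch negatives."""
    # Cosine similarity matrix between all anchor/positive pairs: shape (B, B)
    anchor_emb = F.normalize(anchor_emb, dim=-1)
    positive_emb = F.normalize(positive_emb, dim=-1)
    sim = anchor_emb @ positive_emb.T / temperature
    # The correct pairing for row i is column i
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)

# Random tensors stand in for two encoded views of the same sentences
# (e.g. two dropout-perturbed forward passes of the same encoder).
batch, dim = 8, 256
anchors = torch.randn(batch, dim, requires_grad=True)
positives = torch.randn(batch, dim, requires_grad=True)
loss = contrastive_loss(anchors, positives)
loss.backward()  # in a real training loop this would update the encoder
print(float(loss))
```

The temperature scales the similarity logits; smaller values sharpen the distinction between the positive and the in-batch negatives and are typically tuned per model.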