Word Vector
Word vectors are numerical representations of words that capture semantic meaning and relationships within a vector space. Current research focuses on improving efficiency by reducing vector dimensionality; on exploring model architectures such as GloVe, CNN-BiLSTM, and transformers for tasks such as text classification and zero-shot learning; and on improving the quality and robustness of word vector generation, including through the use of large language models. These advances have significant implications for natural language processing applications, including text classification, information retrieval, and cross-lingual tasks.
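The core idea above — that semantic relatedness becomes geometric proximity in a vector space — can be sketched with a toy example. The vectors and values below are hypothetical, purely for illustration (real embeddings such as GloVe typically have 50–300 dimensions learned from corpora); cosine similarity is the standard measure of closeness:

```python
import numpy as np

# Hypothetical 4-dimensional word vectors (illustrative values only;
# real pretrained embeddings are learned, higher-dimensional, and dense).
vectors = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.2, 0.9, 0.8]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1.0 means similar direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words lie closer together in the vector space.
print(cosine_similarity(vectors["king"], vectors["queen"]))  # high
print(cosine_similarity(vectors["king"], vectors["apple"]))  # low
```

With these toy values, "king" and "queen" score far higher than "king" and "apple", which is exactly the property that downstream tasks like text classification and information retrieval exploit.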
Papers