Static Word Embeddings
Static word embeddings represent each word as a single fixed numerical vector, aiming to capture semantic relationships between words independently of surrounding context. Current research focuses on enriching these embeddings with additional signals, such as multilingual graph knowledge, temporal dynamics, and domain-specific corpora, typically building on algorithms like GloVe, and on mitigating the biases inherent in these representations. This work is significant because accurate and unbiased word embeddings are fundamental to many natural language processing tasks, impacting applications ranging from sentiment analysis and bias detection to cross-lingual information retrieval and social science research.
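As a concrete illustration, the sketch below queries pretrained GloVe vectors through gensim's downloader and probes a simple gender direction. The model name "glove-wiki-gigaword-50" and the she-minus-he bias diagnostic are illustrative choices for this sketch, not methods drawn from any particular paper summarized here.

```python
import numpy as np
import gensim.downloader as api

# Load 50-dimensional GloVe vectors trained on Wikipedia + Gigaword
# (downloads the model on first use; returns a gensim KeyedVectors object).
vectors = api.load("glove-wiki-gigaword-50")

# "Static" means each word maps to one fixed vector, regardless of
# the sentence it appears in.
print(vectors["bank"].shape)  # (50,)

# Semantic relatedness surfaces as vector geometry (cosine similarity).
print(vectors.similarity("king", "queen"))

# The classic analogy test: king - man + woman is close to queen.
print(vectors.most_similar(positive=["king", "woman"],
                           negative=["man"], topn=3))

# Bias can be probed the same way: project words onto a gender
# direction (she - he). This is a common diagnostic, shown here
# only as an assumption-laden example.
gender_dir = vectors["she"] - vectors["he"]
gender_dir /= np.linalg.norm(gender_dir)
for word in ("nurse", "engineer"):
    unit = vectors[word] / np.linalg.norm(vectors[word])
    print(word, float(unit @ gender_dir))
```

A positive projection indicates a word's vector leans toward "she", a negative one toward "he"; debiasing methods aim to shrink such projections for words that should be gender-neutral.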