Static Word Embeddings
Static word embeddings assign each word a single numerical vector, independent of context, with the aim of capturing semantic relationships between words. Current research focuses on improving these embeddings by incorporating additional information, such as multilingual graph knowledge, temporal dynamics, and domain-specific corpora, often building on algorithms like GloVe and exploring methods to mitigate the biases inherent in these representations. This work matters because accurate and unbiased word embeddings underpin many natural language processing tasks, with applications ranging from sentiment analysis and bias detection to cross-lingual information retrieval and social science research.
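To make the representation concrete, here is a minimal sketch of how static embeddings are typically used: load pretrained vectors from a GloVe-format text file and query nearest neighbors by cosine similarity, including the classic king − man + woman ≈ queen analogy. This is an illustration under assumptions, not any specific paper's method; the file name glove.6B.50d.txt stands in for a locally downloaded copy of the Stanford GloVe release.

```python
# Minimal sketch: querying pretrained static embeddings by cosine similarity.
# Assumes a local GloVe-format text file (one word + its vector per line);
# the path below is a placeholder, not provided by this page.
import numpy as np

def load_glove(path):
    """Parse a GloVe text file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

def cosine(u, v):
    """Cosine similarity between two 1-D vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest(vectors, query_vec, k=5, exclude=()):
    """Return the k words whose vectors are most similar to query_vec."""
    scores = [
        (word, cosine(query_vec, vec))
        for word, vec in vectors.items()
        if word not in exclude
    ]
    return sorted(scores, key=lambda pair: -pair[1])[:k]

vectors = load_glove("glove.6B.50d.txt")  # placeholder path
# Vector arithmetic captures some relational structure:
# king - man + woman lands near queen in well-trained embeddings.
query = vectors["king"] - vectors["man"] + vectors["woman"]
print(nearest(vectors, query, exclude={"king", "man", "woman"}))
```

Because each word maps to one fixed vector regardless of context, this lookup-and-compare pattern is cheap, which is why static embeddings remain common in retrieval, bias measurement, and social science pipelines despite the rise of contextual models.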