Static Word Embeddings

Static word embeddings represent each word as a single numerical vector, aiming to capture semantic relationships between words independently of the context in which they appear. Current research focuses on improving these embeddings by incorporating additional information, such as multilingual graph knowledge, temporal dynamics, and domain-specific corpora, often building on algorithms like GloVe and exploring methods to mitigate biases inherent in these representations. This work matters because accurate and unbiased word embeddings are fundamental to many natural language processing tasks, with applications ranging from sentiment analysis and bias detection to cross-lingual information retrieval and social science research.
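The core idea above can be made concrete with a small sketch: each word maps to one fixed vector, and semantic relatedness is typically measured as the cosine similarity between vectors. The 4-dimensional vectors below are hypothetical toy values chosen for illustration (real embeddings such as GloVe usually have 50 to 300 dimensions and are learned from co-occurrence statistics).

```python
import numpy as np

# Hypothetical toy embeddings; real vectors are learned from large corpora.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.2, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v, in [-1, 1]."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)
```

Because the vectors are static, "king" gets the same representation in every sentence; contextual models differ precisely by relaxing this constraint.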

Papers