Lexical Semantics

Lexical semantics investigates how word meanings are represented and processed, with particular attention to the relationships between words and the contexts in which they appear. Current research relies heavily on large language models (LLMs) and neural networks to build distributed word representations that capture nuanced contextual meanings, including ambiguity and polysemy, often using techniques such as self-attention and deep metric learning to improve semantic similarity calculations. This work underpins natural language processing tasks such as word sense disambiguation and machine translation, and it contributes to more human-like language understanding in AI systems. Studies are also exploring the interplay between lexical semantics and other linguistic features, such as word order and grammatical roles, to refine models and improve their accuracy.
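
As a concrete illustration of contextual representations and semantic similarity, the sketch below compares embeddings of the ambiguous word "bank" in different sentences. It is a minimal example, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the word_embedding helper and the example sentences are illustrative choices, not tied to any particular paper in this collection.

```python
# Minimal sketch: contextual embeddings separate the senses of an ambiguous word.
# Assumes `torch` and `transformers` are installed; `bert-base-uncased` is one
# possible choice of contextual encoder.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_embedding(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence` (hypothetical helper)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, hidden_dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]  # assumes `word` maps to a single wordpiece

river = word_embedding("she sat on the bank of the river", "bank")
money = word_embedding("she deposited money at the bank", "bank")
loan = word_embedding("the bank approved the loan", "bank")

cos = torch.nn.functional.cosine_similarity
# The two financial uses of "bank" should score closer to each other
# than either does to the riverside sense.
print("river vs. money:", cos(river, money, dim=0).item())
print("money vs. loan :", cos(money, loan, dim=0).item())
```

Cosine similarity over such context-dependent vectors is one common basis for the semantic similarity calculations mentioned above; deep metric learning approaches typically fine-tune the encoder so that these scores better reflect human similarity judgments.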

Papers