Word Semantics

Word semantics research studies how meaning is represented and processed in language, with the aim of improving computational models' ability to interpret and generate text. Current work emphasizes large language models (LLMs) and techniques such as disentangled representation learning to capture nuanced semantic information in word embeddings, with particular attention to handling phrases and resolving lexical ambiguity. By making language understanding systems more accurate and robust, these advances benefit a range of natural language processing tasks, including sentiment analysis and machine translation, as well as medical applications such as analyzing handwriting in neurodegenerative diseases.
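
To make the idea of ambiguity resolution through embeddings concrete, the sketch below compares phrase embeddings of an ambiguous word in different contexts. It is a minimal illustration, not a method from any paper listed here: the sentence-transformers library and the all-MiniLM-L6-v2 model are assumptions chosen only as a readily available example of contextual embeddings.

```python
# Minimal sketch: phrase embeddings separating senses of an ambiguous word.
# Assumes the sentence-transformers library (pip install sentence-transformers);
# the model name is an arbitrary example, not tied to the papers listed below.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

phrases = [
    "the bank of the river",       # geographic sense of "bank"
    "the bank approved the loan",  # financial sense of "bank"
    "the shore of the river",      # paraphrase of the geographic sense
]
embeddings = model.encode(phrases)  # one dense vector per phrase

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The geographic "bank" phrase should score closer to "shore" than to the
# financial "bank" phrase, showing how context shifts the representation.
print("river-bank vs loan-bank:", cosine(embeddings[0], embeddings[1]))
print("river-bank vs shore:    ", cosine(embeddings[0], embeddings[2]))
```

In this toy setup, the higher similarity between the two river-related phrases illustrates the general point that context-sensitive representations can keep distinct word senses apart, which is one way embedding-based systems address ambiguity.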

Papers