Lexical Semantic Task
Lexical semantic tasks focus on understanding word meaning in context, with the goal of improving how computers process and interpret human language. Current research relies heavily on large language models (LLMs) and explores techniques such as contrastive learning and self-attention mechanisms to enhance word representations, often drawing on lexical resources like WordNet and Wiktionary. These advances matter for downstream natural language processing applications, including knowledge base completion and ontology learning, because they narrow the gap between how humans and models interpret contextual word meaning.
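To make the contrastive-learning idea mentioned above concrete, here is a minimal sketch of an InfoNCE-style loss over word vectors: an anchor representation is pulled toward a "positive" (e.g., the same word sense in another context) and pushed away from "negatives". The toy vectors, the specific InfoNCE form, and the temperature value are illustrative assumptions, not a method from any particular paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce_loss(anchor, positive, negatives, temperature=0.07):
    """InfoNCE contrastive loss: low when the anchor is most similar
    to its positive, high when a negative is more similar instead."""
    sims = [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    logits = [s / temperature for s in sims]
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    return -math.log(exps[0] / sum(exps))

# Toy 2-d "word embeddings": a well-aligned pair vs. a mismatched pair.
good = info_nce_loss([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0]])
bad = info_nce_loss([1.0, 0.0], [0.0, 1.0], [[1.0, 0.0]])
```

In practice the vectors would be contextual embeddings from an LLM encoder, and the loss would be minimized by gradient descent so that occurrences of the same sense cluster together.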