Semantic Structure
Semantic structure research studies how meaning is organized and represented in language and other modalities, with the aim of improving machine comprehension and generation. Current work relies heavily on large language models (LLMs) and graph neural networks to analyze and generate semantic structures, often targeting tasks such as analogical reasoning, relation extraction, and bias detection in text. These advances feed into more capable natural language processing, more robust and interpretable AI systems, and a better understanding of how humans organize meaning.
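As a concrete illustration of what a "semantic structure" can look like in practice, the sketch below represents relation triples extracted from text as a directed graph, the kind of structure a graph neural network or downstream reasoning system would consume. The triples, entity names, and sentence are purely illustrative assumptions, not drawn from any of the papers listed here.

```python
# Minimal sketch: representing extracted relation triples as a semantic graph.
# The triples below are hand-written for illustration; in practice they would
# come from a relation-extraction model (e.g., an LLM prompted to emit triples).
import networkx as nx

# (head entity, relation, tail entity) triples for the example sentence
# "Marie Curie, a physicist born in Warsaw, discovered polonium."
triples = [
    ("Marie Curie", "occupation", "physicist"),
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "discovered", "polonium"),
]

# Use a directed multigraph so parallel relations between the same pair of
# entities are preserved; edge attributes carry the relation type.
graph = nx.MultiDiGraph()
for head, relation, tail in triples:
    graph.add_edge(head, tail, relation=relation)

# Simple queries over the resulting semantic structure.
print(sorted(graph.nodes()))
for head, tail, data in graph.edges(data=True):
    print(f"{head} --[{data['relation']}]--> {tail}")
```

Representing relations this way makes the structure easy to query, visualize, or hand to a graph neural network as nodes and typed edges.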
Papers
Papers in this collection were published between November 28, 2021 and May 13, 2022.