Semantic Structure

Semantic structure research studies how meaning is organized and represented in language and other modalities, with the aim of improving machine comprehension and generation. Current work relies heavily on large language models (LLMs) and graph neural networks to analyze and generate semantic structures, often targeting tasks such as analogical reasoning, relation extraction, and bias detection in text. These advances bear on a range of applications, including stronger natural language processing, more robust and interpretable AI systems, and a deeper understanding of human cognition.
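As a toy illustration (not drawn from any specific paper above), the semantic structures that relation-extraction and graph-based systems operate on are often stored as (head, relation, tail) triples. The class and example facts below are hypothetical, a minimal sketch of that representation:

```python
# Hypothetical sketch: semantic relations stored as
# (head, relation, tail) triples, the basic unit that most
# relation-extraction and graph-based methods work over.
from collections import defaultdict

class SemanticGraph:
    """Minimal triple store for subject-relation-object facts."""
    def __init__(self):
        self.edges = defaultdict(list)  # head -> [(relation, tail), ...]

    def add(self, head, relation, tail):
        self.edges[head].append((relation, tail))

    def neighbors(self, head, relation=None):
        # Tails reachable from `head`, optionally filtered by relation.
        return [t for r, t in self.edges[head] if relation in (None, r)]

graph = SemanticGraph()
graph.add("dog", "is_a", "animal")
graph.add("dog", "has_part", "tail")
graph.add("cat", "is_a", "animal")

print(graph.neighbors("dog", "is_a"))  # ['animal']
```

Real systems replace the hand-entered triples with relations extracted from text and run graph neural networks over the resulting structure, but the underlying data shape is the same.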

Papers