Syntactic Dependency
Syntactic dependency analysis studies the grammatical relationships between words in a sentence, representing sentence structure as a set of directed head-dependent links. Current research emphasizes efficient algorithms for dependency length minimization and neural models that combine graph neural networks (such as Graph Attention Networks) with representations from pre-trained language models (such as BERT) to improve accuracy and efficiency in tasks such as parsing and coreference resolution. This work advances natural language processing applications, including machine translation, sentiment analysis, and question answering, by providing more robust and nuanced representations of sentence structure.
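As a concrete illustration of combining pre-trained language models with arc scoring, the sketch below rates candidate head-dependent arcs over BERT token embeddings with a biaffine layer, in the spirit of graph-based neural dependency parsers. It is a minimal sketch, not the method of any paper listed here: it assumes PyTorch and Hugging Face transformers are installed, and the encoder name, dimensions, and the class name BiaffineArcScorer are illustrative.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BiaffineArcScorer(nn.Module):
    """Scores how likely token j is to be the syntactic head of token i.

    Hypothetical sketch of a biaffine arc scorer over a pre-trained encoder.
    """

    def __init__(self, encoder_name="bert-base-uncased", arc_dim=256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.dep_mlp = nn.Linear(hidden, arc_dim)    # dependent-role projection
        self.head_mlp = nn.Linear(hidden, arc_dim)   # head-role projection
        self.biaffine = nn.Parameter(torch.randn(arc_dim, arc_dim) * 0.01)

    def forward(self, input_ids, attention_mask):
        # Contextual embeddings for each subword token: (batch, seq, hidden)
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        deps = torch.relu(self.dep_mlp(states))      # (batch, seq, arc_dim)
        heads = torch.relu(self.head_mlp(states))    # (batch, seq, arc_dim)
        # arc_scores[b, i, j] = score of token j being the head of token i
        return torch.einsum("bid,de,bje->bij", deps, self.biaffine, heads)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["The cat sat on the mat."], return_tensors="pt")
scorer = BiaffineArcScorer()
with torch.no_grad():
    arc_scores = scorer(batch["input_ids"], batch["attention_mask"])
predicted_heads = arc_scores.argmax(dim=-1)  # greedy head choice per token
```

A full parser would decode a well-formed tree from these scores, typically with the Chu-Liu/Edmonds maximum spanning tree algorithm or a projective dynamic program rather than a greedy per-token argmax, and would add a second classifier for dependency relation labels.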
Papers
Improving Pre-trained Language Models with Syntactic Dependency Prediction Task for Chinese Semantic Error Recognition
Bo Sun, Baoxin Wang, Wanxiang Che, Dayong Wu, Zhigang Chen, Ting Liu
On the Role of Pre-trained Language Models in Word Ordering: A Case Study with BART
Zebin Ou, Meishan Zhang, Yue Zhang