Dependency Treebanks

Dependency treebanks are corpora in which each sentence is annotated with head-dependent relations between words, forming a tree that represents the sentence's syntactic structure. Current research focuses on improving how these treebanks are built, including developing efficient annotation tools (especially for agglutinative languages), harmonizing annotations across treebanks and languages (e.g., within the Universal Dependencies framework), and leveraging machine learning models (such as BERT-based parsers and other parsing algorithms) for automated annotation and analysis. These resources are crucial for advancing linguistic research, particularly on low-resource languages, and for improving the performance of natural language processing applications.
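To make the annotation concrete, the sketch below parses a hand-made sentence in the CoNLL-U format used by Universal Dependencies (ten tab-separated columns: ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC) and recovers the head-dependent arcs; the example sentence and helper functions are illustrative, not from any particular treebank:

```python
# Each token row records the index of its syntactic head (column 7) and the
# relation label (column 8); HEAD = 0 marks the artificial root.
SENTENCE = (
    "1\tShe\tshe\tPRON\t_\t_\t2\tnsubj\t_\t_\n"
    "2\treads\tread\tVERB\t_\t_\t0\troot\t_\t_\n"
    "3\tbooks\tbook\tNOUN\t_\t_\t2\tobj\t_\t_"
)

def parse_conllu(block: str):
    """Return (id, form, head, deprel) tuples for one CoNLL-U sentence."""
    tokens = []
    for line in block.splitlines():
        cols = line.split("\t")
        tokens.append((int(cols[0]), cols[1], int(cols[6]), cols[7]))
    return tokens

def edges(tokens):
    """Resolve head indices to word forms, yielding (head, relation, dependent)."""
    forms = {0: "ROOT", **{tid: form for tid, form, _, _ in tokens}}
    return [(forms[head], rel, form) for tid, form, head, rel in tokens]

for head, rel, dep in edges(parse_conllu(SENTENCE)):
    print(f"{head} --{rel}--> {dep}")
# → reads --nsubj--> She
# → ROOT --root--> reads
# → reads --obj--> books
```

Because every word has exactly one head and the root governs the whole sentence, these arcs form a tree, which is what makes treebanks directly usable as training data for dependency parsers.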

Papers