Latin Text

Research on Latin text focuses on developing computational methods for analyzing and processing this historical language, primarily to improve access to, and understanding of, its vast surviving corpora. Current efforts apply deep learning models, such as Transformer architectures (including BERT and RoBERTa variants) and other neural networks, to tasks like handwriting recognition, morphological analysis, and cross-lingual semantic comparison with Greek and modern languages. These advances support historical linguistic research, improve the accessibility of historical documents, and provide new tools for digital humanities scholarship.
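To make the morphological-analysis task concrete, here is a minimal rule-based sketch in Python: it guesses case and number for a Latin noun from its ending. This is a toy illustration only, not one of the neural models discussed above; the ending table covers just a handful of first- and second-declension forms, and all names in it are assumptions for demonstration.

```python
# Toy suffix-based morphological analyzer for Latin nouns.
# The ENDINGS table is a deliberately tiny, illustrative paradigm;
# real analyzers (rule-based or neural) handle far more ambiguity.

# Map suffix -> list of (declension, case, number) candidates.
ENDINGS = {
    "arum": [("1st", "genitive", "plural")],
    "orum": [("2nd", "genitive", "plural")],
    "ae":   [("1st", "genitive", "singular"), ("1st", "nominative", "plural")],
    "am":   [("1st", "accusative", "singular")],
    "us":   [("2nd", "nominative", "singular")],
    "um":   [("2nd", "accusative", "singular")],
    "a":    [("1st", "nominative", "singular")],
}

def analyze(word: str):
    """Return possible (stem, declension, case, number) analyses."""
    analyses = []
    # Try longer suffixes first so "-arum" is not mis-split as "-um".
    for suffix in sorted(ENDINGS, key=len, reverse=True):
        if word.endswith(suffix) and len(word) > len(suffix):
            stem = word[: -len(suffix)]
            for decl, case, number in ENDINGS[suffix]:
                analyses.append((stem, decl, case, number))
            break  # keep only the longest matching suffix
    return analyses

print(analyze("puellarum"))  # stem "puell", 1st decl. genitive plural
print(analyze("dominum"))    # stem "domin", 2nd decl. accusative singular
```

The ambiguity visible even in this sketch (e.g. "-ae" marking both genitive singular and nominative plural) is one reason the contextual Transformer models mentioned above are attractive for this task.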

Papers