Paper ID: 2410.12852
The Large Language Model GreekLegalRoBERTa
Vasileios Saketos, Despina-Athanasia Pantazi, Manolis Koubarakis
We develop four versions of GreekLegalRoBERTa, four large language models trained on Greek legal and non-legal text. We show that our models surpass the performance of GreekLegalBERT, GreekLegalBERT-v2, and GreekBERT on two tasks involving Greek legal documents: named entity recognition and multi-class legal topic classification. We view our work as a contribution to the study of domain-specific NLP tasks in low-resource languages, like Greek, using modern NLP techniques and methodologies.
Submitted: Oct 10, 2024