Low-Resource Machine Translation

Low-resource machine translation (LRMT) focuses on building accurate translation systems for language pairs with limited parallel training data. Current research emphasizes leveraging large language models (LLMs) for data augmentation, improving data quality through filtering and editing techniques, and exploring effective transfer learning strategies, including pivot-based and multilingual approaches. These advancements aim to bridge the translation gap for under-resourced languages, impacting fields like cross-cultural communication, information access, and multilingual NLP research.
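One transfer strategy mentioned above, pivot-based translation, composes two better-resourced systems (source→pivot and pivot→target, with English as a common pivot) when no direct parallel data exists. A minimal sketch of the composition idea follows; the word-level lookup tables stand in for real MT models and are purely illustrative, not an actual system.

```python
# Sketch of pivot-based translation: when no direct src->tgt parallel data
# exists, route through a high-resource pivot language (often English).
# The lookup tables below are toy stand-ins for trained MT models.

def make_translator(table):
    """Return a word-level translator backed by a lookup table.

    Unknown words pass through unchanged, a common fallback in
    dictionary-based baselines.
    """
    def translate(sentence):
        return " ".join(table.get(word, word) for word in sentence.split())
    return translate

# Toy src->pivot and pivot->tgt systems (illustrative vocabulary only).
src_to_pivot = make_translator({"zuri": "good", "asubuhi": "morning"})
pivot_to_tgt = make_translator({"good": "bon", "morning": "matin"})

def pivot_translate(sentence):
    # Compose the two systems: src -> pivot -> tgt.
    return pivot_to_tgt(src_to_pivot(sentence))

print(pivot_translate("zuri asubuhi"))  # -> "bon matin"
```

In practice the two legs are neural models rather than dictionaries, and error propagation through the pivot is a known weakness, which is one motivation for the multilingual approaches also noted above.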

Papers