Low Resource Machine Translation
Low-resource machine translation (LRMT) focuses on building accurate translation systems for language pairs with limited parallel training data. Current research emphasizes leveraging large language models (LLMs) for data augmentation, improving data quality through filtering and editing techniques, and exploring effective transfer learning strategies, including pivot-based and multilingual approaches. These advances aim to close the translation-quality gap for under-resourced languages, with impact on cross-cultural communication, information access, and multilingual NLP research.
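To make the pivot-based strategy mentioned above concrete, here is a minimal toy sketch. The lexicons and the word-by-word lookup are illustrative placeholders only; in a real system each hop would be a trained source-to-pivot and pivot-to-target MT model, with English commonly serving as the pivot.

```python
# Toy source->pivot lexicon (a hypothetical low-resource language -> English).
SRC_TO_PIVOT = {"allqu": "dog", "mikhun": "eats", "aycha": "meat"}
# Toy pivot->target lexicon (English -> German).
PIVOT_TO_TGT = {"dog": "Hund", "eats": "frisst", "meat": "Fleisch"}

def translate(sentence, lexicon):
    """Word-by-word lookup; unknown words pass through unchanged."""
    return " ".join(lexicon.get(word, word) for word in sentence.split())

def pivot_translate(sentence):
    """Route source -> pivot -> target, mimicking pivot-based transfer."""
    pivot = translate(sentence, SRC_TO_PIVOT)
    return translate(pivot, PIVOT_TO_TGT)

print(pivot_translate("allqu mikhun aycha"))  # Hund frisst Fleisch
```

The appeal of pivoting is that source-pivot and pivot-target corpora are often far larger than any direct source-target corpus, at the cost of compounding errors across the two hops.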