Back Translation
Back translation is a data augmentation technique increasingly used to improve machine translation (MT) models, particularly for low-resource languages. In MT, it typically refers to translating monolingual target-language text back into the source language to create synthetic parallel training data; a related round-trip variant translates text into another language and then back to the original to generate paraphrases. Current research focuses on optimizing back-translation methods within neural architectures such as Transformers, exploring techniques like multilingual transfer learning, and evaluating how different back-translation strategies affect model performance and robustness against adversarial attacks. The technique's significance lies in its ability to enhance MT for under-resourced languages and to improve the reliability of AI-generated text detection systems, ultimately benefiting both cross-cultural research and the development of more robust AI applications.
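As a minimal sketch, the round-trip variant can be outlined as follows. The word-level lexicons below are toy stand-ins for real MT models (an assumption for illustration, not an actual translation API); the point is only the pipeline shape: translate into a pivot language, translate back, and keep the result as an augmented paraphrase of the original example.

```python
# Toy sketch of round-trip back translation for data augmentation.
# The lexicons are stand-ins for real MT systems (assumption).

EN_TO_DE = {"the": "die", "cat": "Katze", "sits": "sitzt"}
# An imperfect inverse: the round trip need not reproduce the input exactly,
# which is precisely what yields paraphrased training data.
DE_TO_EN = {"die": "the", "Katze": "cat", "sitzt": "is sitting"}

def translate(sentence, lexicon):
    # Word-by-word "translation"; unknown words pass through unchanged.
    return " ".join(lexicon.get(word, word) for word in sentence.split())

def back_translate(sentence):
    # Source -> pivot -> source: the result is an augmented variant
    # that can be added to the training set alongside the original.
    pivot = translate(sentence, EN_TO_DE)
    return translate(pivot, DE_TO_EN)

augmented = back_translate("the cat sits")
print(augmented)  # -> "the cat is sitting"
```

In practice the two lexicon lookups would be replaced by forward and reverse neural MT models, and the synthetic pairs would be mixed into the parallel training corpus.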
Papers
University of Cape Town's WMT22 System: Multilingual Machine Translation for Southern African Languages
Khalid N. Elmadani, Francois Meyer, Jan Buys
SIT at MixMT 2022: Fluent Translation Built on Giant Pre-trained Models
Abdul Rafae Khan, Hrishikesh Kanade, Girish Amar Budhrani, Preet Jhanglani, Jia Xu