Back Translation
Back translation, the process of translating text into another language and then back into the original, is a data augmentation technique increasingly used to improve machine translation (MT) models, particularly for low-resource languages. Current research focuses on optimizing back-translation methods within neural architectures such as Transformers, exploring techniques like multilingual transfer learning, and evaluating how different back-translation strategies affect model performance and robustness against adversarial attacks. The technique's significance lies in its ability to enhance MT for under-resourced languages and to improve the reliability of AI-generated text detection systems, ultimately benefiting both cross-cultural research and the development of more robust AI applications.
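The round-trip idea described above can be sketched in a few lines. This is a minimal illustration, not any specific paper's method: the two `translate_*` functions below are hypothetical toy stand-ins for trained MT models (in practice they would wrap Transformer-based translation systems), and the tiny word lexicon exists only to make the example self-contained.

```python
# Minimal sketch of back-translation data augmentation.
# The translate_* functions are hypothetical stand-ins for real
# source->pivot and pivot->source MT models.

def translate_to_pivot(sentence: str) -> str:
    """Toy stand-in for a source->pivot MT model (e.g., English->German)."""
    toy_lexicon = {"the": "die", "cat": "katze", "sleeps": "schlaeft"}
    return " ".join(toy_lexicon.get(w, w) for w in sentence.lower().split())

def translate_to_source(sentence: str) -> str:
    """Toy stand-in for the reverse pivot->source MT model."""
    toy_lexicon = {"die": "the", "katze": "cat", "schlaeft": "sleeps"}
    return " ".join(toy_lexicon.get(w, w) for w in sentence.split())

def back_translate(sentence: str) -> str:
    """Round-trip a sentence through the pivot language to get a paraphrase."""
    return translate_to_source(translate_to_pivot(sentence))

def augment(corpus):
    """Pair each sentence with its back-translated variant, if it differs."""
    augmented = []
    for sentence in corpus:
        augmented.append(sentence)
        variant = back_translate(sentence)
        if variant != sentence:  # keep only variants that add diversity
            augmented.append(variant)
    return augmented
```

With real MT models, the round trip introduces lexical and syntactic variation, so `augment` yields paraphrased training pairs that expand a small parallel corpus.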
Papers
Multilingual Transfer and Domain Adaptation for Low-Resource Languages of Spain
Yuanchang Luo, Zhanglin Wu, Daimeng Wei, Hengchao Shang, Zongyao Li, Jiaxin Guo, Zhiqiang Rao, Shaojun Li, Jinlong Yang, Yuhao Xie, Jiawei Zheng, Bin Wei, Hao Yang
Machine Translation Advancements of Low-Resource Indian Languages by Transfer Learning
Bin Wei, Jiawei Zhen, Zongyao Li, Zhanglin Wu, Daimeng Wei, Jiaxin Guo, Zhiqiang Rao, Shaojun Li, Yuanchang Luo, Hengchao Shang, Jinlong Yang, Yuhao Xie, Hao Yang
Better Alignment with Instruction Back-and-Forth Translation
Thao Nguyen, Jeffrey Li, Sewoong Oh, Ludwig Schmidt, Jason Weston, Luke Zettlemoyer, Xian Li
Simplifying Translations for Children: Iterative Simplification Considering Age of Acquisition with LLMs
Masashi Oshika, Makoto Morishita, Tsutomu Hirao, Ryohei Sasano, Koichi Takeda