Neural Machine Translation
Neural Machine Translation (NMT) aims to automatically translate text between languages using deep learning models, primarily focusing on improving translation accuracy and fluency. Current research emphasizes enhancing model robustness through techniques like contrastive learning to reduce repetition, leveraging translation memories and large language models (LLMs) for improved accuracy and efficiency, and addressing issues such as data scarcity in low-resource languages via data augmentation and transfer learning. These advancements have significant implications for cross-lingual communication, impacting fields ranging from international commerce to multilingual education and accessibility.
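One of the data-augmentation techniques mentioned above for low-resource languages is back-translation: monolingual target-language text is translated into the source language by a reverse-direction model, yielding synthetic parallel pairs for training. The sketch below illustrates the data flow only; `toy_backtranslate` is a hypothetical stand-in (it just reverses word order), where a real pipeline would call a trained target-to-source NMT model.

```python
def toy_backtranslate(target_sentence: str) -> str:
    """Hypothetical stand-in for a trained target->source NMT model.

    Here we simply reverse the word order to simulate producing a
    source-language sentence; a real system would run actual inference.
    """
    return " ".join(reversed(target_sentence.split()))


def augment_with_backtranslation(monolingual_target: list[str]) -> list[tuple[str, str]]:
    """Create synthetic (source, target) pairs from monolingual target text.

    Each genuine target sentence is paired with the synthetic source
    produced by the reverse-direction model; the resulting pairs are
    added to the parallel training corpus.
    """
    return [(toy_backtranslate(t), t) for t in monolingual_target]


# Toy monolingual target-side corpus.
mono = ["the cat sat", "dogs bark loudly"]
synthetic_pairs = augment_with_backtranslation(mono)
for src, tgt in synthetic_pairs:
    print(src, "->", tgt)
```

In practice the synthetic pairs are mixed with (and often down-weighted relative to) genuine parallel data, since the synthetic source side is noisier than human translations.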
Papers
Joint-training on Symbiosis Networks for Deep Neural Machine Translation models
Zhengzhe Yu, Jiaxin Guo, Minghan Wang, Daimeng Wei, Hengchao Shang, Zongyao Li, Zhanglin Wu, Yuxia Wang, Yimeng Chen, Chang Su, Min Zhang, Lizhi Lei, Shimin Tao, Hao Yang
Diformer: Directional Transformer for Neural Machine Translation
Minghan Wang, Jiaxin Guo, Yuxia Wang, Daimeng Wei, Hengchao Shang, Chang Su, Yimeng Chen, Yinglu Li, Min Zhang, Shimin Tao, Hao Yang
Characterizing and addressing the issue of oversmoothing in neural autoregressive sequence modeling
Ilia Kulikov, Maksim Eremeev, Kyunghyun Cho
Isometric MT: Neural Machine Translation for Automatic Dubbing
Surafel M. Lakew, Yogesh Virkar, Prashant Mathur, Marcello Federico
Amortized Noisy Channel Neural Machine Translation
Richard Yuanzhe Pang, He He, Kyunghyun Cho
Isochrony-Aware Neural Machine Translation for Automatic Dubbing
Derek Tam, Surafel M. Lakew, Yogesh Virkar, Prashant Mathur, Marcello Federico