Multilingual Neural Machine Translation
Multilingual neural machine translation (MNMT) aims to build a single model that can translate between many language pairs, improving efficiency and resource use compared to training separate bilingual models. Current research focuses on model architectures (such as Mixture-of-Experts) and training strategies that address challenges including parameter inefficiency, negative interference between languages during fine-tuning, and the "off-target" problem, where the model generates output in the wrong language. These advances matter because they make translation for low-resource languages more efficient and effective and improve the overall quality and robustness of machine translation systems.
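To make the Mixture-of-Experts idea concrete, the sketch below shows a minimal top-1-routed MoE feed-forward block in PyTorch: each token is sent to one of several expert networks chosen by a learned gate, which is how MoE-based MNMT systems add capacity for many languages without increasing per-token compute proportionally. This is an illustrative sketch, not the implementation from any particular paper; the expert count, hidden sizes, and class name are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFeedForward(nn.Module):
    """Illustrative Mixture-of-Experts feed-forward layer with top-1 routing."""

    def __init__(self, d_model: int = 512, d_ff: int = 2048, n_experts: int = 8):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)  # learned gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); flatten so each token is routed independently
        tokens = x.reshape(-1, x.size(-1))
        gate_probs = F.softmax(self.router(tokens), dim=-1)  # (n_tokens, n_experts)
        top_prob, top_idx = gate_probs.max(dim=-1)           # top-1 expert per token
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e
            if mask.any():
                # run only the tokens assigned to this expert, scaled by the gate
                out[mask] = expert(tokens[mask]) * top_prob[mask].unsqueeze(-1)
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = MoEFeedForward()
    hidden = torch.randn(2, 10, 512)   # dummy encoder hidden states
    print(layer(hidden).shape)         # torch.Size([2, 10, 512])
```

In practice such a block replaces the dense feed-forward sublayer inside Transformer encoder/decoder layers, and production systems add a load-balancing loss so tokens (and languages) do not collapse onto a few experts.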
Papers