Multilingual Translation

Multilingual translation aims to build systems capable of translating between many language pairs, including low-resource languages, thereby broadening global access to information and communication. Current research focuses on large language models (LLMs) fine-tuned on multilingual data, using techniques such as contrastive learning and selective parameter updates to mitigate catastrophic forgetting and improve zero-shot translation. These advances are significant both for the scientific understanding of cross-lingual representations and for practical applications such as information access and cross-cultural communication.
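
To make the idea of selective parameter updates concrete, here is a minimal sketch: a pretrained multilingual seq2seq model is frozen, and only a small subset of parameters is re-enabled for fine-tuning, limiting drift from the pretrained weights. The checkpoint (facebook/nllb-200-distilled-600M) and the choice of which parameters to unfreeze (layer norms and the output projection) are illustrative assumptions, not the method of any particular paper.

```python
import torch
from transformers import AutoModelForSeq2SeqLM

# Illustrative checkpoint only; any multilingual seq2seq model would do.
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-200-distilled-600M")

# Freeze everything, then selectively re-enable a small subset of parameters.
for param in model.parameters():
    param.requires_grad = False

# Assumption: updating only layer norms and the output projection is one common
# way to limit drift from the pretrained weights; the name patterns are illustrative.
trainable_patterns = ("layer_norm", "layernorm", "lm_head")
for name, param in model.named_parameters():
    if any(p in name.lower() for p in trainable_patterns):
        param.requires_grad = True

# Optimize only the unfrozen parameters.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)

n_total = sum(p.numel() for p in model.parameters())
n_train = sum(p.numel() for p in trainable)
print(f"updating {n_train / n_total:.1%} of {n_total:,} parameters")
```

The same pattern-matching approach extends to other selective-update schemes (e.g., unfreezing only the top decoder layers or newly added language embeddings); the key point is that the optimizer only ever sees the parameters left trainable.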

Papers