Pre-Trained Neural Machine Translation
Pre-trained neural machine translation (NMT) leverages large language models to improve translation accuracy and efficiency, focusing on adapting these models to specific domains and languages. Current research emphasizes techniques like incorporating relative positional embeddings for better long-range dependency handling, efficient k-nearest-neighbor search for domain adaptation, and methods to enhance translation of named entities and handle noisy data, including multimodal approaches. These advancements are significant because they improve the quality and speed of machine translation, particularly in specialized fields and low-resource languages, leading to broader applications in various sectors.
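The k-nearest-neighbor domain adaptation mentioned above (in the style of kNN-MT) can be sketched as follows: a datastore maps decoder context vectors to target tokens, and at each decoding step the base model's next-token distribution is interpolated with one induced by the retrieved neighbors. This is a minimal illustrative sketch, not any specific paper's implementation; the function name, brute-force search, and all parameter values (`k`, `temperature`, `lam`) are assumptions for illustration.

```python
import numpy as np

def knn_interpolate(model_probs, query, datastore_keys, datastore_vals,
                    vocab_size, k=4, temperature=10.0, lam=0.5):
    """Blend the base model's next-token distribution with one induced
    by the k nearest datastore entries (context vector -> target token).
    Hypothetical sketch of kNN-MT-style interpolation."""
    # Squared L2 distance from the query context vector to every key.
    dists = np.sum((datastore_keys - query) ** 2, axis=1)
    nn = np.argsort(dists)[:k]              # indices of the k nearest keys
    # Softmax over negative distances gives neighbor weights.
    w = np.exp(-dists[nn] / temperature)
    w /= w.sum()
    # Scatter neighbor weights onto the target tokens they stored.
    knn_probs = np.zeros(vocab_size)
    for idx, weight in zip(nn, w):
        knn_probs[datastore_vals[idx]] += weight
    # Interpolate the retrieval and model distributions.
    return lam * knn_probs + (1 - lam) * model_probs
```

In practice the brute-force search here is replaced by an approximate nearest-neighbor index (e.g. FAISS) so retrieval stays fast over datastores with millions of entries, which is where the efficiency work cited above focuses.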