Autoregressive Translation
Autoregressive translation (AT) generates a translation one token at a time, each conditioned on the previously generated tokens, while non-autoregressive translation (NAT) predicts all target tokens simultaneously, offering faster inference but typically lower accuracy. Current research focuses on closing this accuracy gap by scaling NAT models, improving dependency modeling within the decoder, and exploring hybrid approaches that combine the strengths of both AT and NAT, such as generating a sparse sequence autoregressively and then filling in the gaps non-autoregressively. These advances aim to deliver translation systems that are both fast and accurate, improving the speed and cost-effectiveness of machine translation applications.
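The core difference in decoding can be sketched with a toy example. The "model" below is a deterministic stand-in scorer, not a real translation model; all names and the scoring rule are illustrative assumptions. The AT loop conditions each step on the tokens emitted so far, whereas the NAT loop scores every position independently in parallel (here, conditioned only on the position index), which is why NAT can run all positions at once.

```python
# Toy contrast between autoregressive (AT) and non-autoregressive (NAT)
# decoding. VOCAB and the scoring rule are illustrative stand-ins for a
# learned model, not any real system's API.
VOCAB = ["the", "cat", "sat", "down", "<eos>"]

def toy_scores(ctx):
    # Stand-in for a learned model: maps an integer context summary to a
    # score per vocabulary token (arbitrary but deterministic formula).
    return {tok: (3 * ctx + 2 * j) % 7 for j, tok in enumerate(VOCAB)}

def autoregressive_decode(max_len=5):
    # AT: emit one token at a time; each step's context depends on the
    # tokens already generated (here summarized as a sum of indices).
    out = []
    for _ in range(max_len):
        ctx = sum(VOCAB.index(t) for t in out)
        tok = max(toy_scores(ctx), key=toy_scores(ctx).get)
        if tok == "<eos>":
            break
        out.append(tok)
    return out

def non_autoregressive_decode(length=4):
    # NAT: every position is scored independently given only its index,
    # so all positions could be computed in parallel in one pass.
    return [max(toy_scores(i), key=toy_scores(i).get) for i in range(length)]

print(autoregressive_decode())      # sequential: each step waits on the last
print(non_autoregressive_decode())  # parallel: no step depends on another
```

Note that NAT must commit to a target length up front (here the `length` argument); real NAT systems predict it or rerank candidates, and the lack of conditioning between output positions is the main source of NAT's accuracy gap.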