Sequence-to-Sequence Task
Sequence-to-sequence tasks map an input sequence to an output sequence, a fundamental problem in areas such as machine translation and text summarization. Current research focuses on improving efficiency and accuracy through a range of architectures, including Transformers, RNN-Transducers, and newer approaches such as diffusion models and attention-based methods, often combined with techniques like knowledge distillation and parameter-efficient fine-tuning. These advances aim to improve model performance, particularly on long sequences and in resource-constrained environments, enabling better applications in natural language processing and other fields that process sequential data. The sketch below illustrates the basic setup.
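As a concrete illustration of the input-to-output sequence mapping described above, here is a minimal sketch of a Transformer encoder-decoder trained with teacher forcing, using PyTorch's `nn.Transformer`. The vocabulary size, model dimensions, and random toy batch are illustrative assumptions, not drawn from any paper listed on this page.

```python
# Minimal sequence-to-sequence sketch (assumes PyTorch is installed).
# All sizes below are arbitrary illustrative choices.
import torch
import torch.nn as nn

class Seq2SeqTransformer(nn.Module):
    def __init__(self, vocab_size=100, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        # Causal mask keeps the decoder from attending to future tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.embed(src), self.embed(tgt), tgt_mask=tgt_mask)
        return self.out(h)

model = Seq2SeqTransformer()
src = torch.randint(0, 100, (8, 12))  # batch of input token sequences
tgt = torch.randint(0, 100, (8, 10))  # batch of output token sequences
# Teacher forcing: feed the target shifted right, predict the next token.
logits = model(src, tgt[:, :-1])
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 100), tgt[:, 1:].reshape(-1))
```

At inference time the decoder instead generates one token at a time, feeding each prediction back as input; the training-time shift-and-mask scheme shown here is the standard setup that techniques like knowledge distillation and parameter-efficient fine-tuning build on.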