Autoregressive Text Generation
Autoregressive text generation produces text sequentially, one token at a time, with each new token conditioned on the tokens generated before it. Current research emphasizes improving efficiency and controllability: transformer architectures remain the dominant backbone, while techniques such as hierarchical skip decoding and tractable probabilistic models are explored to enforce constraints and improve generation quality. These advances matter because they address limitations in speed and control, enabling faster and more versatile applications in machine translation, dialogue systems, and creative writing.
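To make the core idea concrete, below is a minimal sketch of the autoregressive decoding loop: sample a token from a conditional distribution, append it to the sequence, and repeat. The toy bigram table, the function names, and the temperature parameter are all illustrative assumptions, not any specific paper's method; a real model such as a transformer would condition on the full prefix rather than just the last token.

```python
import math
import random

# Toy bigram "language model": P(next | current). Purely illustrative;
# a real autoregressive model conditions on the entire prefix.
BIGRAM_PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.3, "dog": 0.7},
    "cat": {"sat": 0.7, "</s>": 0.3},
    "dog": {"sat": 0.6, "</s>": 0.4},
    "sat": {"</s>": 1.0},
}

def sample_next(token: str, temperature: float = 1.0) -> str:
    """Sample the next token from the conditional distribution."""
    dist = BIGRAM_PROBS[token]
    # Rescale in log space by temperature, then renormalize (softmax).
    logits = {t: math.log(p) / temperature for t, p in dist.items()}
    z = sum(math.exp(v) for v in logits.values())
    r, acc = random.random(), 0.0
    for t, v in logits.items():
        acc += math.exp(v) / z
        if r <= acc:
            return t
    return t  # fallback against floating-point rounding

def generate(max_len: int = 10, temperature: float = 1.0) -> list:
    """Autoregressive loop: each step conditions on the token just emitted."""
    tokens = ["<s>"]
    for _ in range(max_len):
        nxt = sample_next(tokens[-1], temperature)
        if nxt == "</s>":
            break
        tokens.append(nxt)
    return tokens[1:]  # drop the start symbol

print(" ".join(generate()))
```

Lowering the temperature concentrates probability mass on the most likely continuation (more deterministic output), while raising it flattens the distribution (more diverse output); the efficiency work noted above targets the sequential nature of this loop, since each step must wait for the previous one.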