Generative Transformer Model BART
BART is a generative transformer model that pairs a bidirectional encoder with an autoregressive decoder and is pretrained by reconstructing corrupted text, making it well suited to both text generation and language understanding tasks. Current research focuses on improving BART's performance in areas such as causal inference (particularly with continuous treatments), error correction, and logical reasoning, often through modifications such as incorporating symbolic reasoning or multi-granularity scene graphs. These advances extend BART's reach to diverse applications, including molecular representation learning, cross-lingual dialogue summarization, and even classical poetry style analysis, demonstrating its broad utility across fields.
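As a concrete illustration of how BART is typically applied to a generation task, the sketch below performs abstractive summarization through the Hugging Face transformers API. The facebook/bart-large-cnn checkpoint and the input text are illustrative assumptions, not drawn from the papers listed on this page.

```python
# Minimal sketch: abstractive summarization with a pretrained BART checkpoint.
# Assumes the Hugging Face `transformers` library is installed and the public
# facebook/bart-large-cnn checkpoint is available; both are illustrative choices.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

article = (
    "BART combines a bidirectional encoder with an autoregressive decoder and is "
    "pretrained by reconstructing text corrupted with noising functions, which "
    "makes it effective for generation tasks such as summarization."
)

# Encode the source text, then generate a summary with beam search.
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    max_length=60,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The same conditional-generation interface is what most of the downstream work above builds on: task-specific variants typically swap the checkpoint or fine-tune the model, while the encode-generate-decode loop stays unchanged.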
Papers
Fifteen papers on this topic, published between February 11, 2022 and October 11, 2024.