Sequence Representation
Sequence representation learning aims to capture meaningful patterns and relationships within ordered data, enabling effective modeling of diverse sequential phenomena. Current research emphasizes efficient and generalizable methods, centering on architectures such as Transformers and Graph Neural Networks and often incorporating contrastive learning and hierarchical approaches to improve representation quality and scalability. By enabling more accurate and efficient modeling of sequential data, these advances benefit a range of applications, including speech recognition, recommendation systems, risk prediction, and natural language processing tasks such as coreference resolution.
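To make the contrastive-learning idea concrete, below is a minimal NumPy sketch of an InfoNCE-style objective over pooled sequence embeddings. This is an illustrative toy, not the method of any particular paper: the mean-pooling encoder, the noise-based "augmented view", and all names here are assumptions chosen for brevity; real systems would use a learned encoder such as a Transformer.

```python
import numpy as np

def mean_pool(seq):
    # Collapse a (timesteps, dim) sequence into a single embedding.
    # Stand-in for a learned sequence encoder (illustrative assumption).
    return seq.mean(axis=0)

def info_nce_loss(anchors, positives, temperature=0.1):
    # InfoNCE contrastive loss: each anchor should be most similar to its
    # own positive (another view of the same sequence) versus all others
    # in the batch.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (batch, batch) cosine sims
    logits = logits - logits.max(axis=1, keepdims=True)  # numeric stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # matching pairs on diagonal

rng = np.random.default_rng(0)
# Four toy sequences of 20 timesteps with 8 features each.
batch = [rng.normal(size=(20, 8)) for _ in range(4)]
anchors = np.stack([mean_pool(s) for s in batch])
# Positives: lightly perturbed copies, a stand-in for data augmentation.
positives = np.stack([mean_pool(s + 0.01 * rng.normal(size=s.shape))
                      for s in batch])
loss = float(info_nce_loss(anchors, positives))
print(loss)
```

Because each positive is a near-copy of its anchor, the loss should be close to zero; shuffling the positives against the anchors would drive it up toward the log of the batch size, which is the intuition contrastive objectives exploit to shape the embedding space.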