Hierarchical Context
Hierarchical context modeling aims to improve the processing and understanding of information by leveraging relationships across multiple scales or levels of granularity. Current research focuses on efficiently handling long data sequences, for example in natural language processing and speech synthesis, often employing hierarchical architectures together with techniques such as ADMM or contrastive learning to capture both local and global context. These advances matter for improving the performance of large language models, enhancing speech synthesis, and enabling more robust applications in areas such as image translation and semantic segmentation.
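To make the local/global idea concrete, here is a minimal sketch (not taken from any particular paper) of a two-level context representation: token embeddings are augmented with a chunk-level (local) summary and a sequence-level (global) summary. Mean pooling stands in for the learned encoders a real hierarchical architecture would use, and the function name and chunk size are illustrative assumptions.

```python
import numpy as np

def hierarchical_context(tokens: np.ndarray, chunk_size: int = 4) -> np.ndarray:
    """Toy two-level context model: augment each token embedding with a
    local (chunk-level) summary and a global (sequence-level) summary,
    illustrating context at multiple scales of granularity."""
    n, d = tokens.shape
    # Pad with zeros so the sequence splits evenly into chunks
    # (the padding slightly dilutes the last chunk's mean; fine for a toy).
    pad = (-n) % chunk_size
    padded = np.vstack([tokens, np.zeros((pad, d))])
    chunks = padded.reshape(-1, chunk_size, d)
    local = chunks.mean(axis=1)      # one summary vector per chunk
    global_ = local.mean(axis=0)     # one summary for the whole sequence
    # Broadcast each chunk's summary back to its token positions.
    local_per_token = np.repeat(local, chunk_size, axis=0)[:n]
    # Concatenate token, local, and global features per position.
    return np.concatenate(
        [tokens, local_per_token, np.broadcast_to(global_, (n, d))], axis=1
    )

seq = np.random.default_rng(0).normal(size=(10, 8))
out = hierarchical_context(seq, chunk_size=4)
print(out.shape)  # (10, 24): token + local + global features
```

In a real system the two pooling steps would be replaced by trainable local and global encoders (e.g. chunk-level attention feeding a global aggregator), but the data flow across granularity levels is the same.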