Long Span
"Long span" research addresses the limitations of current models in processing and generating lengthy sequences of data, whether text, audio, or video. Current efforts focus on improving large language models (LLMs) and other deep learning architectures like transformers (including Longformer and variations) and LSTMs to handle longer contexts effectively, often employing techniques like coreference resolution, hierarchical attention, and efficient attention mechanisms. This research is crucial for advancing natural language processing, improving video and audio analysis, and enabling more sophisticated applications in diverse fields such as medical diagnosis, legal document processing, and personalized search.
Papers
Fifteen papers, published between December 2, 2021 and October 25, 2022.