Temporal Dependency
Temporal dependency, the relationship between events or data points across time, poses a central modeling challenge in numerous fields; research focuses on capturing and predicting these dependencies accurately to improve forecasting and decision-making. Current work emphasizes advanced neural network architectures, including transformers, state space models (such as Mamba), and recurrent neural networks, often combined with graph neural networks to capture both temporal and spatial relationships in complex datasets. These advances matter for diverse applications, such as traffic forecasting, financial modeling, medical image analysis, and autonomous driving, by enabling more accurate predictions and a deeper understanding of dynamic systems.
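To make the core idea concrete, below is a minimal sketch, unrelated to the papers listed here, of temporal dependency modeling with one of the architecture families mentioned above: a recurrent network. It assumes PyTorch; the class name GRUForecaster, the toy sine-wave data, and all hyperparameters are illustrative choices, not a reference implementation.

```python
# Minimal sketch: a GRU that reads a window of past observations and
# predicts the next value. The GRU's hidden state carries information
# across time steps, which is what lets it model temporal dependencies.
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):  # hypothetical name for illustration
    def __init__(self, input_dim=1, hidden_dim=32):
        super().__init__()
        self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, input_dim)

    def forward(self, x):
        # x: (batch, seq_len, input_dim) -- a window of past observations
        _, h_n = self.gru(x)       # h_n: (num_layers, batch, hidden_dim)
        return self.head(h_n[-1])  # one-step-ahead prediction

# Toy usage: fit to a noisy sine wave, where each point depends on its
# recent past.
torch.manual_seed(0)
t = torch.arange(0, 200, 0.1)
series = torch.sin(t) + 0.1 * torch.randn_like(t)

window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)])
X = X.unsqueeze(-1)               # (num_windows, window, 1)
y = series[window:].unsqueeze(-1) # next value after each window

model = GRUForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
```

The same windowed-forecasting setup carries over to the other architectures named above (a transformer encoder or a state space layer can replace the GRU), with graph neural networks layered on when observations also have spatial structure, as in traffic forecasting.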
Papers
SpeakerBeam-SS: Real-time Target Speaker Extraction with Lightweight Conv-TasNet and State Space Modeling
Hiroshi Sato, Takafumi Moriya, Masato Mimura, Shota Horiguchi, Tsubasa Ochiai, Takanori Ashihara, Atsushi Ando, Kentaro Shinayama, Marc Delcroix
Online Learning of Temporal Dependencies for Sustainable Foraging Problem
John Payne, Aishwaryaprajna, Peter R. Lewis