Paper ID: 2410.02438
Learning K-U-Net with constant complexity: An Application to time series forecasting
Jiang You, Arben Cela, René Natowicz, Jacob Ouanounou, Patrick Siarry
Training deep models for time series forecasting is a critical task that faces an inherent time-complexity challenge. While current methods generally ensure linear time complexity, our observations on temporal redundancy show that high-level features are learned 98.44% slower than low-level features. To address this issue, we introduce a new exponentially weighted stochastic gradient descent algorithm designed to achieve constant time complexity in deep learning models. We prove that the theoretical complexity of this learning method is constant. Evaluating this method with Kernel U-Net (K-U-Net) on synthetic datasets shows a significant reduction in complexity while improving test-set accuracy.
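The abstract does not detail the algorithm, but the constant-complexity claim can be illustrated with a standard geometric-series argument. The sketch below is a hypothetical toy simulation (the level count `L`, per-level cost `c`, and the specific 2^{-k} update probabilities are assumptions, not taken from the paper): if level k of a feature hierarchy is updated with probability 2^{-k} per step, the expected per-step update cost is bounded by a constant independent of the depth L.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 8          # number of hierarchy levels (hypothetical choice)
c = 1.0        # cost of updating one level's parameters (assumed uniform)

# Exponentially weighted schedule: level k is updated with probability
# 2^{-k}, so high-level parameters are touched exponentially less often.
probs = 0.5 ** np.arange(L)

steps = 100_000
# Boolean matrix: which levels get an update at each step.
updates = rng.random((steps, L)) < probs
cost_per_step = (updates * c).sum(axis=1)

# Expected per-step cost is c * sum_k 2^{-k} < 2c, a constant that
# does not grow with the depth L (geometric series bound).
print(cost_per_step.mean())
```

Under these assumptions the empirical mean stays near 2c however large L is made, which is the sense in which an exponentially weighted update schedule can yield constant (depth-independent) expected complexity per step.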
Submitted: Oct 3, 2024