Paper ID: 2401.04632

Hypercomplex neural network in time series forecasting of stock data

Radosław Kycia, Agnieszka Niemczynowicz

The goal of this paper is to test three classes of neural network (NN) architectures based on four-dimensional (4D) hypercomplex algebras for time series prediction. We evaluate different architectures, varying the input layers to include convolutional, Long Short-Term Memory (LSTM), or dense hypercomplex layers for 4D algebras. Four related stock market time series are used as input data, with the prediction focused on one of them. Hyperparameter optimization was conducted for each architecture class to compare the best-performing neural networks within each class. The results indicate that, in most cases, architectures with hypercomplex dense layers achieve Mean Absolute Error (MAE) accuracy similar to that of the other architectures, but with significantly fewer trainable parameters. Consequently, hypercomplex neural networks can learn and process time series data faster than the other tested architectures. Additionally, it was found that the ordering of the input time series has a notable impact on effectiveness.

Submitted: Jan 9, 2024
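
As a concrete illustration of the parameter sharing behind the hypercomplex dense layers described in the abstract, the sketch below implements a quaternion-style dense layer in NumPy. This is not the authors' implementation: the quaternion algebra is assumed here as one representative 4D hypercomplex algebra, and all names (hypercomplex_dense, Wr/Wi/Wj/Wk) are illustrative.

```python
import numpy as np

# Minimal sketch (not the authors' code) of a dense layer built on a 4D
# hypercomplex algebra, assuming the quaternion algebra as a representative
# example. The four related input series play the role of the four algebra
# components. All names below are illustrative, not taken from the paper.

def hypercomplex_dense(x, W, b=None):
    """Quaternion-like dense layer.

    x : (batch, 4, n)  -- four real components of the hypercomplex input
    W : (4, n, m)      -- four real weight matrices shared via the algebra product
    returns (batch, 4, m)
    """
    xr, xi, xj, xk = x[:, 0], x[:, 1], x[:, 2], x[:, 3]
    Wr, Wi, Wj, Wk = W
    # Hamilton product of the input "quaternion" with the weight "quaternion".
    yr = xr @ Wr - xi @ Wi - xj @ Wj - xk @ Wk
    yi = xr @ Wi + xi @ Wr + xj @ Wk - xk @ Wj
    yj = xr @ Wj - xi @ Wk + xj @ Wr + xk @ Wi
    yk = xr @ Wk + xi @ Wj - xj @ Wi + xk @ Wr
    y = np.stack([yr, yi, yj, yk], axis=1)
    return y if b is None else y + b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    batch, n, m = 8, 16, 32
    x = rng.normal(size=(batch, 4, n))   # e.g. four related stock series, window length n
    W = rng.normal(size=(4, n, m)) * 0.1
    y = hypercomplex_dense(x, W)
    print(y.shape)                       # (8, 4, 32)
    # Parameter count: 4*n*m = 2048 real weights, versus (4n)*(4m) = 8192 for a
    # plain real dense layer of the same input/output width -- one way such
    # layers can end up with far fewer trainable parameters.
```

Because the four weight matrices are reused across all four output components through the algebra product, the layer couples the input series while keeping roughly a quarter of the parameters of a real-valued dense layer of the same width, which is the kind of saving the abstract reports.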