Paper ID: 2409.05577
Approximation Bounds for Recurrent Neural Networks with Application to Regression
Yuling Jiao, Yang Wang, Bokai Yan
We study the approximation capacity of deep ReLU recurrent neural networks (RNNs) and explore the convergence properties of nonparametric least squares regression using RNNs. We derive upper bounds on the approximation error of RNNs for Hölder smooth functions, in the sense that the output at each time step of an RNN can approximate a Hölder function that depends only on past and current information, termed a past-dependent function. This allows a carefully constructed RNN to simultaneously approximate a sequence of past-dependent Hölder functions. We apply these approximation results to derive non-asymptotic upper bounds for the prediction error of the empirical risk minimizer in the regression problem. Our error bounds achieve the minimax optimal rate under both exponentially $\beta$-mixing and i.i.d. data assumptions, improving upon existing results. Our results provide statistical guarantees on the performance of RNNs.
Submitted: Sep 9, 2024
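
The abstract does not spell out the network architecture, so the following is only a minimal NumPy sketch of a single-layer ReLU RNN; all names (`relu_rnn_outputs`, `W_h`, `W_x`, `V`, etc.) are hypothetical. It illustrates why the output at step $t$ is, by construction, a function of $x_1, \dots, x_t$ only, matching the notion of a past-dependent function described above.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_rnn_outputs(xs, W_h, W_x, b, V, c):
    """Run a single-layer ReLU RNN over a sequence xs of shape (T, d_in).

    At step t the hidden state has only seen x_1, ..., x_t, so the per-step
    output o_t depends on past and current inputs only (a "past-dependent"
    function of the sequence).
    """
    h = np.zeros(W_h.shape[0])                 # initial hidden state h_0 = 0
    outputs = []
    for t in range(xs.shape[0]):
        h = relu(W_h @ h + W_x @ xs[t] + b)    # ReLU recurrence
        outputs.append(V @ h + c)              # linear readout o_t at step t
    return np.array(outputs)                   # shape (T, d_out)

# Hypothetical usage with random parameters, one output per time step.
rng = np.random.default_rng(0)
T, d_in, d_h, d_out = 5, 3, 8, 1
xs = rng.normal(size=(T, d_in))
W_h, W_x, b = rng.normal(size=(d_h, d_h)), rng.normal(size=(d_h, d_in)), rng.normal(size=d_h)
V, c = rng.normal(size=(d_out, d_h)), rng.normal(size=d_out)
print(relu_rnn_outputs(xs, W_h, W_x, b, V, c).shape)  # (5, 1)
```

In the regression setting described in the abstract, one would fit such a network by empirical risk minimization, i.e., minimizing the average squared error between the per-step outputs and the observed responses over the RNN parameter class; the paper's bounds concern the prediction error of that minimizer.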