Paper ID: 2312.15974
About rescaling, discretisation and linearisation of $\mathtt{RNN}$
Mariano Caruso, Cecilia Jarne
We explore the mathematical foundations of Recurrent Neural Networks ($\mathtt{RNN}$s) and three fundamental procedures: temporal rescaling, discretisation and linearisation. These techniques provide essential tools for characterising $\mathtt{RNN}$ behaviour, enabling insights into temporal dynamics, practical computational implementation, and linear approximations for analysis. We discuss the flexible order in which these procedures can be applied, emphasising their significance in modelling and analysing $\mathtt{RNN}$s for neuroscience and machine learning applications. We explicitly describe the conditions under which these procedures can be interchanged.
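The three procedures named in the abstract can be illustrated concretely. The sketch below assumes a generic continuous-time $\mathtt{RNN}$ of the common form $\tau\,\dot{x} = -x + W\phi(x) + b$ with $\phi = \tanh$; the paper's own equations and notation may differ, so all symbols and parameter values here are illustrative assumptions, not the authors' definitions.

```python
import numpy as np

# Assumed model (not necessarily the paper's exact form):
#   tau * dx/dt = -x + W @ phi(x) + b,  with phi = tanh.
rng = np.random.default_rng(0)
n = 4
W = 0.5 * rng.standard_normal((n, n))  # illustrative random weights
b = np.zeros(n)
tau = 2.0

def f(x):
    # Right-hand side of the continuous-time dynamics, dx/dt = f(x).
    return (-x + W @ np.tanh(x) + b) / tau

def euler_step(x, dt):
    # Discretisation: forward-Euler update x_{k+1} = x_k + dt * f(x_k).
    return x + dt * f(x)

def jacobian(x):
    # Linearisation: Jacobian of f at state x,
    #   J = (1/tau) * (-I + W diag(phi'(x))),  phi'(x) = 1 - tanh(x)^2.
    # W * (1 - tanh(x)**2) scales column j of W by phi'(x_j).
    return (-np.eye(n) + W * (1 - np.tanh(x) ** 2)) / tau

# Temporal rescaling: substituting t -> t/tau absorbs tau into the clock,
# so one Euler step of size dt at time constant tau equals one step of
# size dt/tau of the tau = 1 system.
x0 = rng.standard_normal(n)
dt = 0.01
x_orig = euler_step(x0, dt)                                 # original clock
x_resc = x0 + (dt / tau) * (-x0 + W @ np.tanh(x0) + b)      # rescaled, tau = 1
assert np.allclose(x_orig, x_resc)
```

The final assertion shows one sense in which rescaling and discretisation commute for this explicit Euler scheme; the paper's contribution is to characterise when such interchanges hold in general.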
Submitted: Dec 26, 2023