RNN-T Training

Training Recurrent Neural Networks (RNNs) remains a significant challenge because backpropagation through time is inherently unstable: gradients can explode or vanish over long sequences. Current research focuses on improving training stability and efficiency through techniques like gradient flossing (regularizing the Lyapunov exponents of the hidden-state dynamics toward zero), alternative training objectives beyond maximum likelihood estimation (e.g., learning to search), and architectural modifications such as incorporating acoustic lookahead in RNN-Transducers for speech recognition. These advances aim to enhance RNN performance in applications including machine translation, speech recognition, and time-series forecasting, particularly in low-resource settings where training data is scarce.
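To make the gradient-flossing idea concrete, the sketch below estimates the finite-time Lyapunov exponents of a vanilla RNN's hidden-state dynamics (via QR-reorthonormalized products of step Jacobians) and penalizes their squared magnitude, so that Jacobian products along the trajectory neither explode nor vanish. This is a minimal illustrative sketch assuming PyTorch; the names (`rnn_cell`, `lyapunov_penalty`) and hyperparameters are hypothetical, not taken from any cited paper's code.

```python
import torch

def rnn_cell(h, x, W_h, W_x, b):
    """One step of a vanilla tanh RNN: h' = tanh(W_h h + W_x x + b)."""
    return torch.tanh(W_h @ h + W_x @ x + b)

def lyapunov_penalty(xs, h0, W_h, W_x, b):
    """Estimate finite-time Lyapunov exponents along one trajectory via
    QR-reorthonormalized products of step Jacobians; return the sum of
    squared exponents as a differentiable "flossing" regularizer."""
    n = h0.shape[0]
    h, Q = h0, torch.eye(n)
    log_r = torch.zeros(n)
    T = xs.shape[0]
    for t in range(T):
        # Jacobian of one step w.r.t. the hidden state, kept in the graph
        # (create_graph=True) so the penalty can be backpropagated into
        # the weights.
        J = torch.autograd.functional.jacobian(
            lambda hh: rnn_cell(hh, xs[t], W_h, W_x, b), h, create_graph=True)
        Q, R = torch.linalg.qr(J @ Q)        # reorthonormalize tangent basis
        log_r = log_r + torch.log(torch.abs(torch.diagonal(R)) + 1e-12)
        h = rnn_cell(h, xs[t], W_h, W_x, b)  # advance the trajectory
    lyap = log_r / T                         # finite-time Lyapunov exponents
    return (lyap ** 2).sum()                 # push all exponents toward zero

# Usage: add the penalty to the task loss before optimizing.
torch.manual_seed(0)
n, m, T = 16, 8, 50
W_h = (0.5 * torch.randn(n, n)).requires_grad_()
W_x = (0.5 * torch.randn(n, m)).requires_grad_()
b = torch.zeros(n, requires_grad=True)
xs, h0 = torch.randn(T, m), torch.zeros(n)

penalty = lyapunov_penalty(xs, h0, W_h, W_x, b)
penalty.backward()  # gradients flow into W_h, W_x, and b
```

Driving the exponents toward zero keeps the long products of step Jacobians well-conditioned, which is exactly the failure mode behind exploding and vanishing gradients over long sequences.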

Papers