Andrea Ceni, Claudio Gallicchio, Residual Echo State Networks: Residual recurrent neural networks with stable dynamics and fast learning, Neurocomputing, 597, 2024, 127966, doi: 10.1016/j.neucom.2024.127966.

Abstract: Residual connections have become a staple of modern deep learning architectures, yet most of their applications are in feedforward computation. In this paper, we study the architectural bias of residual connections in the context of recurrent neural networks (RNNs), specifically along the temporal dimension. We frame our discussion from the perspective of Reservoir Computing and dynamical systems theory, focusing on key aspects of neural computation such as memory capacity, long-term information processing, stability, and nonlinear computation capability. Experiments corroborate the striking advantage brought by temporal residual connections across a broad range of time series processing tasks, comprising memory-based, forecasting, and classification problems.
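
The abstract describes an Echo State Network-style recurrent architecture with a residual (skip) connection along the temporal dimension. The NumPy sketch below illustrates one plausible reading of that idea: the previous reservoir state is carried forward directly and summed with the standard nonlinear reservoir update, rather than being overwritten by it. The branch weights `alpha` and `beta`, the dimensions, and the initialization scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical dimensions and hyperparameters (illustrative only) ---
n_inputs, n_reservoir = 1, 100
alpha, beta = 0.9, 0.5          # weights of the residual (skip) and nonlinear branches
spectral_radius = 0.9           # controls stability of the recurrent dynamics

# Untrained random reservoir weights, as is standard in Reservoir Computing
W_in = rng.uniform(-1, 1, size=(n_reservoir, n_inputs))
W = rng.uniform(-1, 1, size=(n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def residual_esn_states(inputs):
    """Run the reservoir over a sequence; each row of `inputs` is u(t)."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        # Temporal residual connection: x(t-1) is passed forward directly
        # (scaled by alpha) and summed with the usual nonlinear ESN update.
        x = alpha * x + beta * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.asarray(states)

# Only a linear readout on the collected states is trained (e.g., by ridge
# regression), which is what makes learning fast in Reservoir Computing.
```

Consistent with the "fast learning" claim in the title, the recurrent weights in this sketch stay fixed after initialization; training reduces to a linear regression from the collected states to the targets.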