07 September 2024

Publication: Residual Echo State Networks: Residual recurrent neural networks with stable dynamics and fast learning

Residual networks (ResNets) are deep neural networks that use skip connections, allowing each layer to learn the residual mapping between its input and output instead of the direct mapping. This makes the network easier to optimise and enables the training of very deep architectures. Residual networks have achieved state-of-the-art results in various computer vision tasks, and they are widely adopted in the backbones of large language models such as Transformers. More advanced architectures involving residual connections have recently been proposed in other application domains, leveraging the fusion of dynamical systems theory and neural computation concepts. Despite their success in feedforward architectures, residual connections have been little explored in the context of Recurrent Neural Networks (RNNs).
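As a concrete illustration (a minimal sketch of the general idea, not code from the paper), a residual block computes y = x + F(x), so the trainable transformation F only has to fit the residual between input and output; the weight names below are hypothetical:

```python
import numpy as np

def residual_block(x, W1, W2):
    """Minimal residual block: y = x + F(x), where F is a small two-layer map.
    Thanks to the skip connection, the block only needs to learn the residual
    F(x) = y - x rather than the full input-output mapping."""
    h = np.tanh(W1 @ x)   # first transformation with nonlinearity
    fx = W2 @ h           # second (linear) transformation
    return x + fx         # skip connection adds the input back

# Toy usage with random weights, just to show the shapes involved
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W1 = rng.standard_normal((8, 8)) * 0.1
W2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, W1, W2)
```

Stacking many such blocks is what makes very deep networks trainable, since information and gradients can flow through the identity paths.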

In this work, EMERGE partners from the University of Pisa study the architectural bias of residual connections in RNNs, specifically along the temporal dimension. The authors frame their analysis from the perspective of Reservoir Computing and dynamical systems theory, focusing on key aspects of neural computation such as memory capacity, long-term information processing, stability, and nonlinear computation capability. Experiments demonstrate the striking advantage brought by temporal residual connections across a wide range of time series processing tasks, including memory-based, forecasting, and classification problems.
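To give a flavour of the idea (a rough sketch under our own assumptions, not the paper's exact equations; the function name and the coefficients alpha and beta are hypothetical), a temporal residual connection carries the previous reservoir state forward through a skip path, so the fixed nonlinear reservoir only contributes a residual update at each time step:

```python
import numpy as np

def res_esn_step(h_prev, x, W_in, W_rec, alpha=0.9, beta=0.5):
    """One hypothetical temporal-residual reservoir update. Following the
    Reservoir Computing paradigm, W_in and W_rec stay fixed (untrained);
    only a linear readout on the collected states would be learned,
    which is what makes training fast."""
    residual = np.tanh(W_rec @ h_prev + W_in @ x)  # untrained nonlinear part
    return alpha * h_prev + beta * residual        # temporal skip connection

# Driving the reservoir with a toy scalar time series
rng = np.random.default_rng(0)
n = 50
W_in = rng.uniform(-0.1, 0.1, size=(n, 1))
W_rec = rng.standard_normal((n, n))
# Scale the spectral radius below 1, a common recipe for stable dynamics
W_rec *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rec)))
h = np.zeros(n)
for t in range(100):
    u = np.array([np.sin(0.1 * t)])  # toy input signal
    h = res_esn_step(h, u, W_in, W_rec)
```

The skip term helps the state preserve information over long time spans, which relates to the memory capacity and stability properties studied in the paper.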

Read the paper at the link below.