06 October 2023

Publication: Residual Reservoir Computing Neural Networks for Time-series Classification

Reservoir Computing (RC) networks are a class of recurrent neural models that have become extremely popular over the years due to their efficient training. Rather than applying end-to-end backpropagation through time, RC exploits the properties of asymptotically stable recurrent layers to avoid most of the computational burden of training.

In fact, the only trainable component in the architecture is a readout layer. In practice, the hidden recurrent layer of the architecture, called the reservoir, remains untrained after random initialization, subject to a stability condition known as the echo state property. On the one hand, this property allows for stable dynamics and well-behaved state-space organization, which has been successfully exploited in various application scenarios, especially in pervasive AI environments.
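As a rough illustration of this training scheme, the sketch below sets up a standard Echo State Network in NumPy: the input and recurrent weights are drawn at random and left fixed (with the recurrent matrix rescaled to a spectral radius below one, a common recipe for encouraging the echo state property), and only a linear readout is fit, here by ridge regression. All dimensions and hyperparameters are illustrative and are not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only.
n_in, n_res, n_out = 3, 100, 2

# Reservoir weights are randomly initialized and then left untrained.
W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
W_res = rng.uniform(-1.0, 1.0, size=(n_res, n_res))
# Rescale the recurrent matrix to spectral radius 0.9 (a common heuristic
# for encouraging the echo state property).
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def run_reservoir(inputs):
    """Drive the fixed reservoir with an input sequence of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_res @ x + W_in @ u)
        states.append(x)
    return np.asarray(states)

def fit_readout(states, targets, ridge=1e-6):
    """Train the only learnable part: a linear readout, via ridge regression."""
    A = states.T @ states + ridge * np.eye(n_res)
    return np.linalg.solve(A, states.T @ targets)
```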

On the other hand, the system is intrinsically biased towards fading memory computation, which inevitably reduces the ability of the network to effectively propagate the driving input information across multiple time steps.

In this work, EMERGE partners from the University of Pisa introduce a novel class of RC models, a family of efficiently trainable Recurrent Neural Networks based on untrained connections. Aiming to improve the forward propagation of input information through time, the authors augment standard Echo State Networks (ESNs) with linear reservoir-skip connections modulated by an untrained orthogonal weight matrix. They analyse the mathematical properties of the resulting reservoir systems and show that the dynamical regime of the proposed class of models is controllably close to the edge of stability. Experiments on several time-series classification tasks highlight a striking performance advantage of the proposed approach over standard ESNs.
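The sketch below illustrates one plausible form of such a reservoir-skip connection, in the same NumPy style as the ESN sketch above: the previous state is propagated both through the usual nonlinear branch and through a linear branch modulated by a random orthogonal matrix that is never trained. The mixing coefficients alpha and beta and the exact update rule are illustrative assumptions; the precise formulation and its stability analysis are given in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 3, 100

# Same untrained reservoir setup as in the ESN sketch above.
W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
W_res = rng.uniform(-1.0, 1.0, size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

# Untrained orthogonal matrix modulating the linear reservoir-skip connection.
O, _ = np.linalg.qr(rng.standard_normal((n_res, n_res)))

def run_residual_reservoir(inputs, alpha=0.5, beta=0.5):
    """Residual reservoir update: orthogonal linear skip plus nonlinear branch.
    alpha and beta are hypothetical mixing coefficients, not the paper's values."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = alpha * (O @ x) + beta * np.tanh(W_res @ x + W_in @ u)
        states.append(x)
    return np.asarray(states)
```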

Read the paper: https://doi.org/10.14428/esann/2023.ES2023-112

Access EMERGE publications via the link below.