11 October 2024


Reservoir Computing (RC) is a powerful paradigm for designing Recurrent Neural Networks (RNNs), widely recognized for its efficiency: the recurrent part of the network is left untrained, and only a simple readout layer is fitted to data. This makes RC particularly relevant for pervasive artificial intelligence (AI) and neuromorphic hardware implementations, where it enables low-power, high-speed processing in line with the goals of sustainable AI.
In this work, EMERGE partners from the University of Pisa introduce Reservoir Memory Networks (RMNs), a novel class of RC models that couple a linear memory cell with a non-linear reservoir to enhance long-term information retention. The authors explore several configurations of the memory cell, based on orthogonal circular shift matrices and on Legendre polynomials, alongside non-linear reservoirs configured as in Echo State Networks and Euler State Networks. Experimental results demonstrate the substantial benefits of RMNs on time-series classification tasks, highlighting their potential for RC applications that require robust temporal processing.
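To make the architecture concrete, here is a minimal NumPy sketch of one plausible RMN state update, assuming a circular-shift memory cell and an Echo-State-Network-style reservoir; the exact equations, coupling weights, and hyperparameters are illustrative choices, not the paper's definitive formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
m, r = 20, 100  # memory-cell size and reservoir size (illustrative values)

# Linear memory cell: an orthogonal circular-shift matrix, one of the
# memory configurations mentioned above.
V_m = np.roll(np.eye(m), 1, axis=0)
v_in = rng.uniform(-1, 1, m)

# Non-linear reservoir, configured Echo-State-Network style: random
# recurrent weights rescaled to a target spectral radius below 1.
W_r = rng.uniform(-1, 1, (r, r))
W_r *= 0.9 / max(abs(np.linalg.eigvals(W_r)))
W_m = rng.uniform(-1, 1, (r, m)) * 0.5   # memory-to-reservoir coupling (assumed)
w_in = rng.uniform(-1, 1, r) * 0.5

def run(u):
    """Drive the sketch RMN with a scalar input sequence; return the states."""
    x_m, x_r, states = np.zeros(m), np.zeros(r), []
    for u_t in u:
        x_m = V_m @ x_m + v_in * u_t                       # linear long-term memory
        x_r = np.tanh(W_r @ x_r + W_m @ x_m + w_in * u_t)  # non-linear reservoir
        states.append(np.concatenate([x_m, x_r]))          # readout would see both
    return np.asarray(states)

states = run(rng.uniform(-1, 1, 50))
print(states.shape)  # one concatenated state of size m + r per time step
```

As in standard RC, only a linear readout on these collected states would be trained; the memory cell and reservoir weights stay fixed after initialization.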
Read the paper at the link below.