M. Pinna, A. Ceni and C. Gallicchio, "Residual Reservoir Memory Networks," 2025 International Joint Conference on Neural Networks (IJCNN), Rome, Italy, 2025, pp. 1-7, doi: 10.1109/IJCNN64981.2025.11227859.
Abstract: We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter employs residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over other conventional RC models. Code is available at github.com/NennoMP/residualrmn
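The architecture described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the exact update equations, dimensions, scaling constants, and the form of the residual connection (here, an orthogonal matrix applied to the previous state, added to a tanh recurrence) are hypothetical choices consistent with the abstract's description of a linear memory reservoir paired with a non-linear reservoir using residual orthogonal temporal connections. Both reservoirs are untrained, as in standard Reservoir Computing; only a readout (not shown) would be fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper).
n_in, n_lin, n_res, T = 1, 50, 100, 20

# Linear memory reservoir: plain linear recurrence (hypothetical form).
W_m = rng.uniform(-1, 1, (n_lin, n_lin))
W_m *= 0.9 / max(abs(np.linalg.eigvals(W_m)))  # spectral radius below 1
V_m = rng.uniform(-1, 1, (n_lin, n_in))

# Non-linear reservoir with a residual orthogonal temporal connection
# (assumed form): h_t = O @ h_{t-1} + tanh(W @ h_{t-1} + V @ u_t).
O, _ = np.linalg.qr(rng.standard_normal((n_res, n_res)))  # orthogonal O
W = rng.uniform(-1, 1, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
V = rng.uniform(-1, 1, (n_res, n_in))

u = rng.standard_normal((T, n_in))           # toy input sequence
m = np.zeros(n_lin)                          # linear memory state
h = np.zeros(n_res)                          # non-linear reservoir state
states = []
for t in range(T):
    m = W_m @ m + V_m @ u[t]                 # linear memory update
    h = O @ h + np.tanh(W @ h + V @ u[t])    # residual orthogonal update
    states.append(np.concatenate([m, h]))    # concatenated state for readout

states = np.stack(states)
print(states.shape)  # (20, 150)
```

Because O is orthogonal, the residual path neither shrinks nor amplifies the previous state, which is the intuition the abstract gives for enhanced long-term propagation of the input.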