Andrea Ceni, Valerio De Caro, Davide Bacciu, Claudio Gallicchio, "Sparse assemblies of recurrent neural networks with stability guarantees," Neurocomputing, vol. 675, 2026, art. no. 132952, DOI: 10.1016/j.neucom.2026.132952.

Abstract: We introduce AdaDiag, a framework for constructing sparse assemblies of recurrent neural networks (RNNs) with formal stability guarantees. Our approach builds on contraction theory, designing RNN modules that are inherently contractive through an adaptive diagonal parametrization and learnable characteristic time scales. This formulation keeps each module fully trainable while preserving global stability under skew-symmetric coupling. We provide a rigorous theoretical analysis of contractivity, along with a complexity discussion showing that stability is achieved at no additional computational cost. Experiments on ten heterogeneous time-series benchmarks demonstrate that AdaDiag consistently surpasses SCN, LSTM, and vanilla RNN baselines and achieves performance competitive with state-of-the-art models while requiring substantially fewer trainable parameters. These results highlight the effectiveness of sparse, stable assemblies for efficient, adaptive, and generalizable sequence modeling.
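
The mechanism the abstract names (a contractive recurrent module with an adaptive diagonal, learnable time scales, and skew-symmetric coupling between units) can be sketched in a few lines of PyTorch. The cell below is a minimal illustration under assumptions made here, not the paper's implementation: the class name AdaDiagCell, the softplus and sigmoid parametrizations, and the leaky Euler update are all choices of this sketch; the paper's exact formulation and contraction conditions may differ.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AdaDiagCell(nn.Module):
        """Hypothetical sketch of one contractive module: skew-symmetric
        coupling plus a strictly negative adaptive diagonal, with per-unit
        learnable time scales (illustrative only, not the paper's code)."""

        def __init__(self, n_in: int, n_hid: int):
            super().__init__()
            self.U = nn.Linear(n_in, n_hid)                          # input projection
            self.A = nn.Parameter(0.1 * torch.randn(n_hid, n_hid))   # raw coupling weights
            self.log_d = nn.Parameter(torch.zeros(n_hid))            # adaptive diagonal (pre-softplus)
            self.raw_tau = nn.Parameter(torch.zeros(n_hid))          # time scales (pre-sigmoid)

        def forward(self, u: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
            S = self.A - self.A.T              # skew-symmetric part: S + S^T = 0
            d = F.softplus(self.log_d)         # d > 0, so -diag(d) is negative definite
            W = S - torch.diag(d)              # symmetric part of W is -diag(d)
            dt = torch.sigmoid(self.raw_tau)   # per-unit step size in (0, 1)
            # One leaky Euler step of x' = -x + tanh(W x + U u). The strictly
            # negative symmetric part of W is the ingredient a contraction
            # argument builds on; the published proof may use different conditions.
            return x + dt * (-x + torch.tanh(x @ W.T + self.U(u)))

    # Usage: roll the cell over a short random input sequence.
    cell = AdaDiagCell(n_in=3, n_hid=16)
    x = torch.zeros(1, 16)
    for t in range(10):
        x = cell(torch.randn(1, 3), x)

The design intuition, hedged accordingly: because W = S - diag(d) has a strictly negative symmetric part regardless of how A, log_d, and raw_tau are trained, nearby trajectories of the underlying dynamics contract toward each other, which is the kind of per-module property that lets an assembly remain stable under skew-symmetric inter-module coupling, as the abstract claims.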