03 February 2026
From sensor signals to language, sequential data presents demanding challenges: adapting to shifting dynamics, delivering stable predictions, and capturing long-term dependencies. Recurrent neural networks (RNNs) offer a powerful framework for modeling such temporal processes, yet they reveal a tension between full adaptivity and stability. Stability of the recurrent dynamics is essential for reliable learning and robust generalization, while adaptability to diverse and evolving input regimes remains a key requirement for real-world applications.
In this study, EMERGE partners from the University of Pisa introduce AdaDiag, a framework for constructing sparse assemblies of RNNs with formal stability guarantees. The authors' approach builds upon contraction theory by designing RNN modules that are inherently contractive through adaptive diagonal parametrization and learnable characteristic time scales. This formulation enables each module to remain fully trainable while preserving global stability under skew-symmetric coupling. The authors provide a rigorous theoretical analysis of contractivity, along with a complexity discussion showing that stability is achieved without additional computational burden. Experiments on ten heterogeneous time series benchmarks demonstrate that AdaDiag consistently surpasses SCN, LSTM, and vanilla RNN baselines, and achieves competitive performance with state-of-the-art models, all while requiring substantially fewer trainable parameters. These results highlight the effectiveness of sparse and stable assemblies for efficient, adaptive, and generalizable sequence modeling.
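To give a flavor of the idea, the sketch below shows a minimal diagonal recurrent update with per-unit learnable time scales and a skew-symmetric coupling matrix. This is an illustrative toy, not the authors' AdaDiag formulation: the leak parametrization `lam = exp(-dt / tau)`, the coupling normalization, and all variable names are assumptions made for the example. The point it demonstrates is contraction: two trajectories started from different states, driven by the same inputs, converge to each other.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # total state size of the assembly

# Learnable characteristic time scales tau > 0. The per-unit diagonal
# recurrence lam_i = exp(-dt / tau_i) lies strictly inside (0, 1), so each
# unit forgets its past state geometrically (illustrative reparametrization,
# not the paper's exact construction).
dt = 0.5
tau = np.exp(0.5 * rng.normal(size=n))
lam = np.exp(-dt / tau)

# Skew-symmetric coupling K = A - A^T, rescaled so its infinity norm stays
# below 1. Then every row of the Jacobian lam_i + (1 - lam_i) * tanh' * K
# has absolute sum at most lam_i + 0.9 * (1 - lam_i) < 1, giving a
# contraction in the max norm (hypothetical normalization choice).
A = rng.normal(size=(n, n))
K = A - A.T
K = 0.9 * K / np.linalg.norm(K, np.inf)

W_in = rng.normal(size=(n, 3))  # input weights for a 3-dimensional input

def step(x, u):
    """One recurrent update: x <- lam*x + (1-lam)*tanh(K x + W_in u)."""
    return lam * x + (1 - lam) * np.tanh(K @ x + W_in @ u)

# Contractivity check: two different initial states, same input sequence.
x1, x2 = rng.normal(size=n), rng.normal(size=n)
for _ in range(1000):
    u = rng.normal(size=3)
    x1, x2 = step(x1, u), step(x2, u)

print(np.linalg.norm(x1 - x2))  # far below the initial gap
```

Because the diagonal part is strictly stable and the coupling is norm-bounded, stability here is a property of the parametrization itself rather than something enforced during training, which is the spirit of the guarantee described above.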
Read the paper at the link below.

