25 April 2025
Recurrent neural networks (RNNs) are computational models that can be regarded as dynamical systems. Modularity is a key ingredient of complex systems. Thus, composing RNN modules provides a simple paradigm for building complex computational models, with the potential to approach the capabilities of the human brain.
A necessary step toward the ambitious goal of unlocking the expressive power of an RNN composed of smaller, specialized, adaptive RNNs is to guarantee the stability of the whole network as a dynamical system, without resorting to complex optimization schemes that enforce stability during training.
To achieve this objective, EMERGE partners from the University of Pisa propose two strategies that allow the internal weights of RNN modules to adapt while ensuring that each module retains contractive dynamics. They test the new strategies on the popular permuted sequential MNIST (psMNIST) and sequential MNIST (sMNIST) tasks and compare them to Sparse Combo Net (SCN). Experiments demonstrate the effectiveness of the proposal, with the new strategies outperforming the original SCN in all cases.
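The post does not spell out the two strategies, but the underlying principle is easy to illustrate. Below is a minimal PyTorch sketch, not the authors' method: a vanilla RNN cell whose recurrent weight matrix is rescaled so that its spectral norm stays below 1, which, combined with a 1-Lipschitz activation such as tanh, makes the state-update map a contraction while leaving all weights free to adapt during training. The class name `ContractiveRNNCell` and the rescaling scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ContractiveRNNCell(nn.Module):
    """Vanilla RNN cell kept contractive by construction (illustrative sketch).

    With a 1-Lipschitz activation like tanh, the update
    h_{t+1} = tanh(W h_t + U x_t + b) is a contraction in h whenever
    the spectral norm of W is strictly below 1.
    """

    def __init__(self, input_size: int, hidden_size: int, max_norm: float = 0.99):
        super().__init__()
        self.W = nn.Parameter(torch.randn(hidden_size, hidden_size) * 0.1)
        self.U = nn.Parameter(torch.randn(hidden_size, input_size) * 0.1)
        self.b = nn.Parameter(torch.zeros(hidden_size))
        self.max_norm = max_norm  # target bound on the spectral norm of W

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Rescale W so its spectral norm never exceeds max_norm; the
        # weights remain trainable, and contractivity is preserved.
        spec = torch.linalg.matrix_norm(self.W, ord=2)
        W_eff = self.W * (self.max_norm / torch.clamp(spec, min=self.max_norm))
        return torch.tanh(x @ self.U.T + h @ W_eff.T + self.b)


# Usage on a pixel-by-pixel sequence, as in the sMNIST setup:
cell = ContractiveRNNCell(input_size=1, hidden_size=64)
x_seq = torch.randn(8, 784, 1)   # a batch of 8 flattened 28x28 images
h = torch.zeros(8, 64)
for t in range(x_seq.size(1)):
    h = cell(x_seq[:, t], h)
```

Because every module is contractive on its own, stability of the composed network follows from the architecture itself rather than from an extra regularization term in the training objective.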
Read the paper at the link below.

