22 September 2023
Recent work has demonstrated that fast and accurate time series classification is feasible with methods based on randomized convolutional kernels. For graph-structured data, most randomized graph neural networks follow the Echo State Network paradigm, in which individual layers or the whole network contain some form of recurrence.
In this work, EMERGE partners from the University of Pisa and their collaborators explore a simpler form of randomized graph neural network. The authors implement a no-frills graph convolutional network and leave its weights untrained. They then aggregate the node representations with global pooling operators, obtaining an untrained graph-level representation. Since no training is involved, computing such a representation is extremely fast.
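As a rough illustration of this idea (not the authors' code), the sketch below builds an untrained graph convolutional encoder with randomly initialized weights and pools the node embeddings into a single graph-level vector. The number of layers, the hidden sizes, the symmetric adjacency normalization and the mean/max pooling are all illustrative assumptions.

```python
import torch

def gcn_layer(adj_norm, x, weight):
    # One graph convolution: propagate features over the normalized adjacency,
    # then apply a random (never trained) linear map and a nonlinearity.
    return torch.relu(adj_norm @ x @ weight)

def untrained_graph_embedding(adj, x, hidden_dims=(64, 64)):
    # Symmetrically normalize the adjacency with self-loops: D^-1/2 (A+I) D^-1/2.
    n = adj.shape[0]
    a_hat = adj + torch.eye(n)
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    adj_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]

    h = x
    for dim in hidden_dims:
        # Randomly initialized weights, left untrained (no gradient updates).
        w = torch.randn(h.shape[1], dim) / h.shape[1] ** 0.5
        h = gcn_layer(adj_norm, h, w)

    # Global pooling: aggregate node embeddings into one graph-level vector.
    return torch.cat([h.mean(dim=0), h.max(dim=0).values])
```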
A fast linear classifier is then applied to the obtained representations. The authors show that this simple approach achieves competitive predictive performance while being extremely efficient at both training and inference time. The group also studies when over-parameterization, i.e., generating more features than are needed to interpolate the training data, can benefit the generalization abilities of the resulting models. Building on the algorithmic stability framework and on empirical evidence from the considered graph datasets, they shed light on this over-parameterization setting.
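A minimal sketch of the readout stage is given below, assuming the untrained graph-level representations have already been computed (e.g. by the encoder sketched above) and stacked into a feature matrix. The use of scikit-learn's RidgeClassifierCV is an assumption here, one common choice of fast linear classifier on top of randomized features, and not necessarily the classifier used in the paper.

```python
import numpy as np
from sklearn.linear_model import RidgeClassifierCV

def fit_linear_readout(graph_features: np.ndarray, labels: np.ndarray) -> RidgeClassifierCV:
    # graph_features: one row per graph, produced by the untrained encoder.
    # A regularized linear classifier with built-in cross-validation keeps
    # training cost negligible compared to end-to-end gradient descent.
    clf = RidgeClassifierCV(alphas=np.logspace(-3, 3, 10))
    clf.fit(graph_features, labels)
    return clf
```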
Read the paper at the link below.

