Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks

Fernando Gama, Elvin Isufi, Geert Leus, Alejandro Ribeiro

Research output: Contribution to journal › Review article › Scientific › peer-review

45 Citations (Scopus)
34 Downloads (Pure)


Network data can be conveniently modeled as a graph signal, where data values are assigned to nodes of a graph that describes the underlying network topology. Successful learning from network data is built upon methods that effectively exploit this graph structure. In this article, we leverage graph signal processing (GSP) to characterize the representation space of graph neural networks (GNNs). We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology. These two properties offer insight about the workings of GNNs and help explain their scalability and transferability properties, which, coupled with their local and distributed nature, make GNNs powerful tools for learning in physical networks. We also introduce GNN extensions using edge-varying and autoregressive moving average (ARMA) graph filters and discuss their properties. Finally, we study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
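The graph convolutional filters discussed in the abstract are polynomials of a graph shift operator applied to a graph signal. A minimal sketch of such a filter and a check of its permutation equivariance is shown below; the toy graph, the signal values, and the filter taps `h` are illustrative assumptions, not taken from the article.

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph filter y = sum_k h[k] * S^k @ x.

    S: graph shift operator (e.g., adjacency matrix), x: graph signal,
    h: filter taps. Each multiplication by S exchanges information only
    between neighboring nodes, which is what makes the operation local.
    """
    y = np.zeros_like(x, dtype=float)
    Skx = x.astype(float)          # S^0 x
    for hk in h:
        y += hk * Skx
        Skx = S @ Skx              # advance to S^(k+1) x
    return y

# Toy 4-node cycle graph and an arbitrary graph signal (assumed values).
S = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, -2.0, 0.5, 3.0])
h = [0.5, 0.3, 0.2]                # illustrative filter taps

y = graph_filter(S, x, h)

# Permutation equivariance: relabeling the nodes and then filtering
# gives the same result as filtering and then relabeling.
P = np.eye(4)[[2, 0, 3, 1]]        # a permutation matrix
y_perm = graph_filter(P @ S @ P.T, P @ x, h)
assert np.allclose(y_perm, P @ y)
```

A GNN stacks such filters with pointwise nonlinearities; since both operations commute with node relabelings, the equivariance property checked above extends to the full architecture.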

Original language: English
Article number: 9244191
Pages (from-to): 128-138
Number of pages: 11
Journal: IEEE Signal Processing Magazine
Issue number: 6
Publication status: Published - 2020

Bibliographical note

Green Open Access added to TU Delft Institutional Repository ‘You share, we take care!’ – Taverne project

Otherwise, as indicated in the copyright section, the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.


