There is a lot of expository material on Graph Neural Networks (GNNs) out there. Here I want to give a short “mathematical” introduction, in the sense of arriving quickly at the concepts of invariance and equivariance to permutations and at how GNNs are designed to respect them, in order to discuss in a future post our work (??, ????) on the convergence of GNNs to some “continuous” limit when the number of nodes goes to infinity.
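To fix ideas, here is a minimal statement of these two properties (the notation below is mine, introduced for illustration): describe a graph on $n$ nodes by an adjacency matrix $A \in \mathbb{R}^{n \times n}$ and node features $X \in \mathbb{R}^{n \times d}$, and let $\Pi$ be an $n \times n$ permutation matrix. A node-level map $F$ is permutation equivariant if
$$ F(\Pi A \Pi^\top, \Pi X) = \Pi\, F(A, X), $$
while a graph-level map $f$ is permutation invariant if
$$ f(\Pi A \Pi^\top, \Pi X) = f(A, X). $$
Message-passing layers aggregate over neighborhoods in a way that commutes with relabelings of the nodes, hence are equivariant, and a final pooling over the nodes (sum, mean, max) produces an invariant graph-level output.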
Relational data represent relationships between entities anywhere on the web (e.g. online social networks) or in the physical world (e.g. the structure of a protein).
Graph neural networks are intimately related to partial differential equations governing information diffusion on graphs. Thinking of GNNs as PDEs leads to a new broad class of graph ML methods.
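As a rough sketch of this diffusion viewpoint (a generic illustration, not the construction of any specific paper): on a graph with edge weights $a_{ij}$, the heat-type diffusion equation for node features $x_i(t)$ reads
$$ \frac{\mathrm{d} x_i(t)}{\mathrm{d} t} = \sum_{j : (i,j) \in E} a_{ij}\,\big(x_j(t) - x_i(t)\big), $$
and an explicit Euler step with step size $\tau$,
$$ x_i^{(k+1)} = x_i^{(k)} + \tau \sum_{j : (i,j) \in E} a_{ij}\,\big(x_j^{(k)} - x_i^{(k)}\big), $$
has exactly the shape of a message-passing layer: each node updates its state from a weighted aggregation over its neighbors. Letting the weights $a_{ij}$ depend on the features (for instance through an attention mechanism) and learning that dependence is what turns the discretized diffusion equation into a trainable GNN.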