Abstract
Graph Neural Networks (GNNs) have recently become increasingly popular due to
their ability to learn complex systems of relations or interactions arising in
a broad spectrum of problems ranging from biology and particle physics to
social networks and recommendation systems. Despite the plethora of different
models for deep learning on graphs, few approaches have been proposed thus far
for dealing with graphs that are dynamic in nature (e.g., with features or
connectivity evolving over time). In this paper, we present Temporal Graph
Networks (TGNs), a generic, efficient framework for deep learning on dynamic
graphs represented as sequences of timed events. Thanks to a novel combination
of memory modules and graph-based operators, TGNs are able to significantly
outperform previous approaches while simultaneously being more computationally
efficient. We furthermore show that several previous models for learning on
dynamic graphs can be cast as specific instances of our framework. We perform a
detailed ablation study of different components of our framework and devise the
best configuration that achieves state-of-the-art performance on several
transductive and inductive prediction tasks for dynamic graphs.
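To make the "memory modules" idea concrete, here is a hypothetical, heavily simplified sketch of a per-node memory updated by a stream of timed interaction events. The `NodeMemory` class, the fixed `tanh` projection (standing in for the learned GRU update used in the paper), and all dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # illustrative memory/message dimensionality

class NodeMemory:
    """Toy per-node memory updated by timed events (assumed sketch, not TGN itself)."""

    def __init__(self, num_nodes, dim=DIM):
        self.memory = np.zeros((num_nodes, dim))       # one memory vector per node
        self.last_update = np.zeros(num_nodes)         # timestamp of each node's last event
        # Random projection standing in for learned update weights.
        self.W = rng.standard_normal((dim, 3 * dim + 1)) / np.sqrt(3 * dim + 1)

    def update(self, src, dst, t, edge_feat):
        # Build both messages from the pre-event memories first, so the two
        # endpoints of one event see each other's state before the update.
        msgs = {}
        for node, other in ((src, dst), (dst, src)):
            delta_t = t - self.last_update[node]
            # message = [own memory | other memory | edge features | time since last update]
            msgs[node] = np.concatenate(
                [self.memory[node], self.memory[other], edge_feat, [delta_t]]
            )
        for node, msg in msgs.items():
            self.memory[node] = np.tanh(self.W @ msg)  # simplified stand-in for a GRU cell
            self.last_update[node] = t

# A dynamic graph as a sequence of timed events: (src, dst, timestamp, edge features).
mem = NodeMemory(num_nodes=3)
events = [(0, 1, 1.0, rng.standard_normal(DIM)),
          (1, 2, 2.5, rng.standard_normal(DIM))]
for src, dst, t, feat in events:
    mem.update(src, dst, t, feat)

print(mem.memory.shape)  # (3, 4)
```

In the full framework, graph-based operators would then aggregate these memories over each node's temporal neighborhood to produce embeddings for downstream prediction tasks.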