Paper ID: 2403.12529
Contextualized Messages Boost Graph Representations
Brian Godwin Lim, Galvin Brice Lim, Renzo Roel Tan, Kazushi Ikeda
Graph neural networks (GNNs) have gained significant attention in recent years for their ability to process data that may be represented as graphs. This success has prompted several studies to explore the representational capability of GNNs through the graph isomorphism task. These works inherently assume countable node feature representations, potentially limiting their applicability. Interestingly, only a few theoretical works study GNNs with uncountable node feature representations. This paper presents a novel perspective on the representational capability of GNNs at all levels (node-level, neighborhood-level, and graph-level) when the space of node feature representations is uncountable. Specifically, it relaxes the injectivity requirement of previous works by employing an implicit pseudometric distance on the input space to create a soft-injective function. Such a function allows distinct inputs to produce similar outputs only when the pseudometric deems the inputs sufficiently similar on some representation, which is often useful in practice. As a consequence, a novel soft-isomorphic relational graph convolution network (SIR-GCN) that emphasizes the non-linear and contextualized transformation of neighborhood feature representations is proposed. A mathematical discussion on the relationship between SIR-GCN and widely used GNNs then puts the contribution in context, establishing SIR-GCN as a generalization of classical GNN methodologies. Experiments on synthetic and benchmark datasets demonstrate the relative superiority of SIR-GCN, which outperforms comparable models in node and graph property prediction tasks.
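
Below is a minimal PyTorch sketch of the kind of layer the abstract describes, assuming a contextualized message of the form h_u* = W_R Σ_{v∈N(u)} σ(W_Q h_u + W_K h_v), where each neighbor's feature is transformed jointly with the receiving node's feature before aggregation. The class name SIRGCNLayer, the ReLU non-linearity, and the exact parameterization are illustrative assumptions, not the paper's definitive implementation.

```python
# Illustrative sketch of a soft-isomorphic relational graph convolution
# layer (SIR-GCN); the parameterization here is an assumption based on
# the abstract, not the paper's exact formulation.
import torch
import torch.nn as nn


class SIRGCNLayer(nn.Module):
    """Contextualized message passing: neighbor features are transformed
    relative to the receiving node, not in isolation."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.w_query = nn.Linear(in_dim, hidden_dim, bias=False)  # center node transform
        self.w_key = nn.Linear(in_dim, hidden_dim, bias=False)    # neighbor transform
        self.w_relation = nn.Linear(hidden_dim, out_dim, bias=False)  # post-aggregation mix
        self.sigma = nn.ReLU()  # non-linearity applied *before* aggregation

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim); edge_index: (2, num_edges) as (src, dst) rows.
        src, dst = edge_index
        # Contextualized message: combine receiver and neighbor features,
        # then apply the non-linearity per edge.
        messages = self.sigma(self.w_query(x)[dst] + self.w_key(x)[src])
        # Sum-aggregate messages at each destination node.
        aggregated = torch.zeros(x.size(0), messages.size(1), device=x.device)
        aggregated.index_add_(0, dst, messages)
        return self.w_relation(aggregated)


# Usage: a 4-node toy graph with bidirectional edges.
if __name__ == "__main__":
    x = torch.randn(4, 8)
    edge_index = torch.tensor([[0, 1, 1, 2, 2, 3],
                               [1, 0, 2, 1, 3, 2]])
    layer = SIRGCNLayer(in_dim=8, hidden_dim=16, out_dim=8)
    print(layer(x, edge_index).shape)  # torch.Size([4, 8])
```

Because the non-linearity acts on the pairwise (receiver, neighbor) combination before summation, the message carries context that a standard GCN-style linear transform of neighbor features alone cannot express; this is the sense in which such a layer generalizes classical message-passing schemes.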
Submitted: Mar 19, 2024