Full Graph

Full-graph training of Graph Neural Networks (GNNs) computes each update over the entire graph at once, which yields high accuracy but hits severe memory limits on large datasets, since activations for every node must be held simultaneously. Current research therefore focuses on memory-efficient training methods, including spanning-subgraph training, feature- and label-constrained graph reduction, and various forms of model and data parallelism (e.g., pipelined parallelism and asynchronous communication). These advances aim to make full-graph GNN training scale to larger and more complex real-world networks in domains such as social network analysis and bioinformatics.
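To make the memory trade-off concrete, here is a minimal toy sketch (not taken from any of the surveyed systems): a dense full-graph GCN-style layer, where the whole adjacency matrix and all node activations live in memory at once, alongside a simple stand-in for spanning-subgraph training that drops edges per epoch while keeping every node connected. The function names, the dense matrices, and the edge-restoring heuristic are all illustrative assumptions; real systems use sparse operations and principled subgraph constructions.

```python
import numpy as np

def full_graph_layer(adj, feats, weight):
    """One full-graph GCN-style layer: every node aggregates over all
    its neighbors in a single step, so the activations for the entire
    graph must fit in memory (toy dense version for illustration)."""
    deg = adj.sum(axis=1, keepdims=True)
    norm_adj = adj / np.maximum(deg, 1.0)          # mean aggregation
    return np.maximum(norm_adj @ feats @ weight, 0.0)  # ReLU

def spanning_subgraph(adj, rng, keep_prob=0.5):
    """Illustrative spanning-subgraph sampler (an assumption, not the
    method of any specific paper): randomly drop edges, then restore
    one edge for any node left without neighbors, so every node still
    participates in the epoch while the edge set shrinks."""
    mask = rng.random(adj.shape) < keep_prob
    sub = adj * mask
    for i in range(adj.shape[0]):
        if sub[i].sum() == 0 and adj[i].sum() > 0:
            j = int(np.argmax(adj[i]))             # restore one edge
            sub[i, j] = adj[i, j]
    return sub

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 4-node toy graph (symmetric adjacency, no self-loops)
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
    feats = rng.random((4, 3))
    weight = rng.random((3, 2))
    out = full_graph_layer(adj, feats, weight)
    sub = spanning_subgraph(adj, rng)
    print(out.shape, int(sub.sum()), int(adj.sum()))
```

Training on `sub` instead of `adj` each epoch reduces the aggregation cost and the memory footprint of intermediate messages, at the price of a biased (but node-covering) view of the graph per step.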

Papers