Full Graph
Full-graph training of Graph Neural Networks (GNNs) offers high accuracy because every layer aggregates information over the entire graph at once, but this same property imposes severe memory demands when scaling to large datasets. Current research therefore focuses on memory-efficient training methods, including spanning-subgraph training, feature- and label-constrained graph reduction, and various forms of model and data parallelism (e.g., pipelined parallelism and asynchronous communication). These advances aim to improve the scalability and efficiency of full-graph GNN training, enabling its application to larger and more complex real-world networks in domains such as social network analysis and bioinformatics.
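To make the memory bottleneck concrete, below is a minimal sketch of vanilla full-graph GCN training in PyTorch: every forward/backward pass materializes activations for all N nodes simultaneously, which is exactly the cost that subgraph training, graph reduction, and parallelism techniques try to avoid. The graph construction, shapes, and hyperparameters here are illustrative assumptions, not taken from any particular paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    """One graph convolution: H' = A_hat @ H @ W (A_hat: normalized adjacency)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, A_hat, H):
        # A_hat: (N, N) sparse; H: (N, in_dim). The whole graph is touched here.
        return torch.sparse.mm(A_hat, self.linear(H))

class GCN(nn.Module):
    def __init__(self, in_dim, hidden_dim, n_classes):
        super().__init__()
        self.layer1 = GCNLayer(in_dim, hidden_dim)
        self.layer2 = GCNLayer(hidden_dim, n_classes)

    def forward(self, A_hat, X):
        return self.layer2(A_hat, F.relu(self.layer1(A_hat, X)))

# Toy setup (illustrative): random features, labels, and a sparse random graph.
N, F_in, C = 1000, 16, 7
X = torch.randn(N, F_in)
y = torch.randint(0, C, (N,))
A = (torch.rand(N, N) < 0.01).float()
A = ((A + A.t() + torch.eye(N)) > 0).float()        # symmetrize + self-loops
d = A.sum(dim=1)
A_hat = (A / torch.sqrt(d.unsqueeze(0) * d.unsqueeze(1))).to_sparse()  # D^-1/2 A D^-1/2

model = GCN(F_in, 32, C)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(5):
    opt.zero_grad()
    # Full-graph step: loss and intermediate activations cover all N nodes,
    # so peak memory grows with graph size rather than batch size.
    loss = F.cross_entropy(model(A_hat, X), y)
    loss.backward()
    opt.step()
```

Memory-efficient variants keep this per-epoch objective but change what is held in memory at once, e.g. training on spanning subgraphs, on a reduced graph that preserves feature/label structure, or on graph partitions distributed across devices with pipelined or asynchronous communication of boundary activations.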