Subgraph-Level Training
Subgraph-level training improves the efficiency and effectiveness of graph neural network (GNN) training by operating on subgraphs: smaller, more manageable portions of a larger graph. Current research emphasizes efficient subgraph selection and partitioning strategies, often combined with contrastive learning or knowledge distillation to boost model performance and reduce memory consumption. Because each training step touches only a subgraph, this approach sidesteps the memory and compute limits of training GNNs on massive graphs, enabling scalability and improved accuracy in applications such as knowledge graph completion, recommender systems, and spatiotemporal forecasting. Subgraph-level training is also being explored for handling out-of-distribution data and for improving few-shot learning scenarios.
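To make the partitioning idea concrete, the following is a minimal sketch (not any specific paper's method) of subgraph-level processing in the style of Cluster-GCN: nodes are split into partitions, the induced subgraph of each partition is extracted (cross-partition edges are dropped), and a toy mean-aggregation GNN layer runs on each subgraph independently, so peak memory scales with the partition rather than the full graph. The partitioner here is a naive contiguous split; real systems typically use a graph partitioner such as METIS.

```python
import numpy as np

def partition_nodes(num_nodes, num_parts):
    """Split node IDs into roughly equal contiguous blocks.
    (A stand-in for smarter partitioners, e.g. METIS.)"""
    return np.array_split(np.arange(num_nodes), num_parts)

def induced_subgraph(edges, nodes):
    """Keep only edges with both endpoints in `nodes`, relabelled 0..k-1.
    Cross-partition edges are dropped, as in Cluster-GCN-style training."""
    relabel = {n: i for i, n in enumerate(nodes)}
    return [(relabel[u], relabel[v]) for u, v in edges
            if u in relabel and v in relabel]

def gcn_layer(features, edges, weight):
    """One mean-aggregation GNN layer with ReLU on an undirected (sub)graph."""
    n, _ = features.shape
    agg = features.copy()          # include each node's own features
    deg = np.ones(n)               # self-loop counts toward the mean
    for u, v in edges:
        agg[v] += features[u]
        deg[v] += 1
        agg[u] += features[v]
        deg[u] += 1
    return np.maximum((agg / deg[:, None]) @ weight, 0.0)

# Toy graph: 8 nodes in a ring, random features and weights (hypothetical data).
num_nodes, dim = 8, 4
edges = [(i, (i + 1) % num_nodes) for i in range(num_nodes)]
rng = np.random.default_rng(0)
features = rng.normal(size=(num_nodes, dim))
weight = rng.normal(size=(dim, dim))

# Subgraph-level forward pass: each partition is processed independently.
outputs = np.zeros_like(features)
for part in partition_nodes(num_nodes, num_parts=2):
    sub_edges = induced_subgraph(edges, part.tolist())
    outputs[part] = gcn_layer(features[part], sub_edges, weight)

print(outputs.shape)  # (8, 4)
```

In a full training loop, each subgraph's forward pass would be followed by a loss and gradient update before moving to the next partition; the trade-off is that information carried by dropped cross-partition edges is lost, which is what motivates the subgraph selection strategies mentioned above.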