GNN Training
Training Graph Neural Networks (GNNs) efficiently on massive graphs is a critical challenge driving current research. Efforts focus on optimizing various stages of the training process, including subgraph sampling strategies, efficient data I/O and memory management (especially leveraging CPU-GPU heterogeneous environments and persistent memory), and communication reduction techniques for distributed training. These advancements aim to improve both the speed and scalability of GNN training, enabling the application of these powerful models to increasingly larger and more complex datasets in diverse fields like recommendation systems and knowledge graph reasoning.
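As a rough illustration of the subgraph sampling stage mentioned above, the sketch below implements layer-wise neighbor sampling in plain Python, loosely in the style of GraphSAGE-type mini-batch training. The function name, the fixed per-hop fanout policy, and the adjacency-list representation are illustrative assumptions, not the API of any particular GNN system:

```python
import random

def sample_subgraph(adj, seeds, fanouts, seed=0):
    """Layer-wise neighbor sampling sketch (GraphSAGE-style).

    adj:     dict mapping node -> list of neighbor nodes
    seeds:   target nodes of the current mini-batch
    fanouts: max neighbors to sample per node at each hop,
             outermost hop first
    Returns the set of nodes whose features must be loaded to
    compute embeddings for the seed nodes.
    """
    rng = random.Random(seed)
    frontier = set(seeds)
    visited = set(seeds)
    for fanout in fanouts:
        next_frontier = set()
        for node in frontier:
            nbrs = adj.get(node, [])
            # Cap the sample size at the actual degree of the node.
            k = min(fanout, len(nbrs))
            for nbr in rng.sample(nbrs, k):
                if nbr not in visited:
                    visited.add(nbr)
                    next_frontier.add(nbr)
        frontier = next_frontier
    return visited
```

Bounding each hop's fanout keeps the sampled subgraph (and hence per-batch memory and I/O) roughly constant regardless of the full graph's size, which is the core idea behind sampling-based scalability.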