GNN Training Framework
Training Graph Neural Networks (GNNs) on massive graphs presents significant computational challenges, driving research into efficient distributed training frameworks. Current efforts focus on minimizing communication overhead through techniques such as asynchronous updates, one-bit gradient quantization, and novel graph partitioning strategies (e.g., vertex cuts, which split a high-degree vertex's edges across machines rather than assigning whole vertices to a single machine), alongside methods that mitigate the staleness of the embeddings and gradients workers exchange. These advances aim to improve both the speed and scalability of GNN training, enabling these models to be applied to increasingly large and complex datasets in domains such as social network analysis and drug discovery.
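To make the compression idea concrete, here is a minimal sketch of one-bit (sign) quantization with error feedback, the general mechanism used to cut gradient traffic in distributed training. The function names and the standalone demo are illustrative assumptions, not the API of any particular GNN framework.

```python
import torch

def one_bit_compress(grad: torch.Tensor, residual: torch.Tensor):
    """Quantize a gradient to sign bits plus one scalar scale.

    The residual from the previous round is folded in first (error
    feedback), so quantization error is corrected over time instead
    of being lost. Names and signature are illustrative.
    """
    corrected = grad + residual
    scale = corrected.abs().mean()            # one float32 per tensor
    signs = torch.sign(corrected)             # -1/0/+1; packable to ~1 bit/element
    new_residual = corrected - scale * signs  # carry the error to the next step
    return signs, scale, new_residual

def one_bit_decompress(signs: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Reconstruct an approximate gradient from signs and a scale."""
    return scale * signs

# Standalone demo on a synthetic gradient (in a real framework the signs
# would be bit-packed and all-reduced across workers).
grad = torch.randn(4, 4)
residual = torch.zeros_like(grad)
signs, scale, residual = one_bit_compress(grad, residual)
approx = one_bit_decompress(signs, scale)
print(f"compression error (L2 norm): {(grad - approx).norm().item():.4f}")
```

In this scheme each worker transmits only the packed signs and a single scale per tensor, roughly a 32x reduction over raw float32 gradients, at the cost of an approximation error that the residual term absorbs across iterations.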