Graph Condensation
Graph condensation aims to synthesize a small, representative graph from a large one so that graph neural networks (GNNs) trained on the condensed graph match the predictive performance of training on the original, at a fraction of the cost. Current research focuses on efficient condensation algorithms, most often based on gradient matching or distribution alignment, with some work exploring training-free approaches and addressing robustness to noise and generalization to unseen data. The technique is significant because it removes the computational bottleneck of training GNNs on massive datasets, making GNNs practical for larger and more complex real-world problems across domains.
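To make the gradient-matching idea concrete, the sketch below optimizes the node features of a small synthetic graph so that the gradients a GNN computes on it track the gradients computed on the real graph. This is a minimal illustration loosely in the spirit of gradient-matching condensation methods such as GCond (Jin et al., 2022), not any paper's reference implementation: the one-layer GCN, identity adjacencies, fixed synthetic labels, and all sizes and hyperparameters are illustrative assumptions.

```python
# Minimal gradient-matching sketch for graph condensation (illustrative only).
import torch
import torch.nn.functional as F

class SimpleGCN(torch.nn.Module):
    """One-layer GCN, logits = A_hat @ X @ W; enough to show the matching loop."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w = torch.nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, adj, x):
        return self.w(adj @ x)

def gradient_matching_loss(model, real, syn):
    """Distance between GNN gradients on the real and synthetic graphs."""
    adj_r, x_r, y_r = real
    adj_s, x_s, y_s = syn
    g_real = torch.autograd.grad(
        F.cross_entropy(model(adj_r, x_r), y_r), model.parameters())
    g_syn = torch.autograd.grad(
        F.cross_entropy(model(adj_s, x_s), y_s), model.parameters(),
        create_graph=True)  # keep the graph so the loss backprops into x_syn
    # Sum of per-parameter distances; 1 - cosine similarity is a common choice.
    return sum(1 - F.cosine_similarity(gr.flatten(), gs.flatten(), dim=0)
               for gr, gs in zip(g_real, g_syn))

# Toy data: a random "large" graph and a small learnable synthetic one.
n_real, n_syn, d, c = 200, 20, 16, 4
x_real = torch.randn(n_real, d)
adj_real = torch.eye(n_real)          # identity adjacency keeps the toy simple
y_real = torch.randint(0, c, (n_real,))

x_syn = torch.randn(n_syn, d, requires_grad=True)  # learnable synthetic features
adj_syn = torch.eye(n_syn)                         # fixed here; GCond learns it
y_syn = torch.arange(n_syn) % c                    # fixed, class-balanced labels

opt = torch.optim.Adam([x_syn], lr=0.01)
for step in range(100):
    model = SimpleGCN(d, c)  # re-sample weights so x_syn fits many initializations
    loss = gradient_matching_loss(model, (adj_real, x_real, y_real),
                                  (adj_syn, x_syn, y_syn))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice such methods also parameterize the synthetic adjacency (e.g., as a function of the synthetic features) and match gradients across multiple initializations and training stages; distribution-alignment methods instead match statistics of node embeddings, avoiding the inner gradient computation.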