Graph Condensation
Graph condensation synthesizes a small, representative graph that stands in for a much larger one, so that a graph neural network (GNN) trained on the condensed graph closely matches the predictive performance of one trained on the original. Most current methods learn the condensed graph via gradient matching (aligning the gradients a GNN produces on the synthetic and real graphs) or distribution alignment (matching statistics of node embeddings), while some recent work explores training-free condensation and addresses robustness to noise and generalization to unseen data. The technique matters because it tackles the computational bottleneck of training GNNs on massive datasets, making GNNs practical for larger and more complex real-world problems across domains.
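To make the gradient-matching idea concrete, below is a minimal, self-contained sketch in plain PyTorch. The one-layer GCN, the random stand-in data, the identity adjacency for the synthetic graph, and all sizes and hyperparameters are illustrative assumptions, not the formulation of any particular paper.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in "real" graph: N nodes with a sparse random, row-normalized
# adjacency. All sizes here are arbitrary placeholders.
N, d, C = 500, 32, 4
A = torch.eye(N) + (torch.rand(N, N) < 0.01).float()
A = A / A.sum(dim=1, keepdim=True)
X = torch.randn(N, d)
y = torch.randint(0, C, (N,))

# Learnable condensed graph: n << N synthetic nodes. Labels are fixed
# and balanced; the synthetic adjacency is frozen to the identity,
# a common simplification.
n = 40
X_syn = torch.randn(n, d, requires_grad=True)
y_syn = torch.arange(n) % C
A_syn = torch.eye(n)

def gcn_logits(adj, feats, W):
    # One-layer GCN: logits = A X W.
    return adj @ feats @ W

opt = torch.optim.Adam([X_syn], lr=0.01)
for step in range(200):
    # Fresh random GNN weights each step, so the synthetic graph must
    # match gradients across initializations, not for one fixed model.
    W = torch.randn(d, C, requires_grad=True)

    loss_real = F.cross_entropy(gcn_logits(A, X, W), y)
    g_real, = torch.autograd.grad(loss_real, W)

    loss_syn = F.cross_entropy(gcn_logits(A_syn, X_syn, W), y_syn)
    # create_graph=True keeps this gradient differentiable w.r.t. X_syn.
    g_syn, = torch.autograd.grad(loss_syn, W, create_graph=True)

    # Gradient-matching objective: make the gradient induced by the
    # synthetic graph mimic the one induced by the real graph.
    match = F.mse_loss(g_syn, g_real)
    opt.zero_grad()
    match.backward()
    opt.step()
```

Published methods typically go further than this sketch, for instance learning the synthetic adjacency as a function of the synthetic features, matching gradients per class over mini-batches, and training deeper GNNs, but the core loop of optimizing synthetic data against a gradient-distance loss is the same.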