Graph Hypernetworks
Graph hypernetworks (GHNs) combine graph neural networks and hypernetworks to predict the parameters of diverse neural network architectures, aiming to improve efficiency and reduce the computational cost of training large models. Current research focuses on addressing limitations like over-smoothing and heterophily, developing more efficient architectures such as low-rank GHNs, and applying GHNs to various tasks including federated learning, model initialization, and quantization-aware training. This approach holds significant promise for accelerating deep learning research and development by enabling efficient training and deployment of large models across diverse applications and hardware constraints.
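The core idea above can be sketched in a few lines: the target architecture is encoded as a computation graph, a small GNN refines per-node embeddings via message passing, and a shared decoder maps each node's embedding to that layer's flattened parameters. This is a minimal illustrative sketch, not any specific published GHN; all sizes, names, and the mean-aggregation update are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target architecture: a chain of 3 linear layers, encoded as a graph.
# adj[i][j] = 1 means node j feeds node i (assumed encoding).
num_nodes = 3
adj = np.array([[0, 0, 0],
                [1, 0, 0],
                [0, 1, 0]], dtype=float)
param_shapes = [(8, 4), (8, 8), (2, 8)]  # (out, in) per layer

hidden = 16
node_emb = rng.normal(size=(num_nodes, hidden))   # initial node features
w_msg = rng.normal(size=(hidden, hidden)) * 0.1   # GNN message weights
w_upd = rng.normal(size=(hidden, hidden)) * 0.1   # GNN update weights

def gnn_round(h):
    """One round of mean-aggregation message passing over the arch graph."""
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    msgs = (adj @ (h @ w_msg)) / deg
    return np.tanh(h @ w_upd + msgs)

for _ in range(2):
    node_emb = gnn_round(node_emb)

# Hypernetwork decoder: one linear head per distinct parameter-tensor shape,
# shared across all nodes that need that shape.
decoders = {s: rng.normal(size=(hidden, s[0] * s[1])) * 0.1
            for s in set(param_shapes)}

# Predict each layer's weight tensor from its node embedding.
predicted = [(node_emb[i] @ decoders[s]).reshape(s)
             for i, s in enumerate(param_shapes)]

for p, s in zip(predicted, param_shapes):
    assert p.shape == s  # every layer received parameters of the right shape
```

In a real GHN the GNN and decoder weights would themselves be trained (e.g. so that the predicted parameters yield low loss on a task), which is what lets a single trained GHN emit usable initializations for many unseen architectures.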
Papers
May 31, 2024
May 25, 2024
February 26, 2024
October 25, 2023
September 28, 2023
September 24, 2023
September 30, 2022
August 26, 2022
July 20, 2022