Message Passing
Message passing is the core operation of graph neural networks (GNNs): nodes iteratively exchange information with their neighbors along the graph's edges to learn node representations. Current research focuses on improving the efficiency and expressiveness of message passing, exploring techniques such as dynamic hierarchy learning, optimized pooling operators, and the incorporation of information from large language models or random walks to capture long-range dependencies. These advances are impacting diverse fields, including physics simulation, drug discovery, and recommendation systems, by enabling more accurate and efficient analysis of complex, graph-structured data.
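To make the iterative exchange concrete, here is a minimal sketch of one message-passing round in NumPy. The function name, the mean-aggregation rule, and the ReLU update are illustrative assumptions, not the method of any specific paper listed below; stacking such rounds lets information propagate across multiple hops.

```python
import numpy as np

def message_passing_step(H, edges, W_self, W_neigh):
    """One round of message passing (illustrative sketch):
    each node averages its neighbors' features, then combines
    that aggregate with its own features via learned weights."""
    n = H.shape[0]
    agg = np.zeros_like(H)
    deg = np.zeros(n)
    for src, dst in edges:              # a message flows src -> dst
        agg[dst] += H[src]
        deg[dst] += 1
    deg = np.maximum(deg, 1)            # guard isolated nodes against /0
    agg = agg / deg[:, None]            # mean aggregation (one common choice)
    return np.maximum(0, H @ W_self + agg @ W_neigh)  # ReLU update

# Toy graph: a 3-node path 0-1-2, with edges in both directions.
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
H = np.eye(3)                           # one-hot initial node features
rng = np.random.default_rng(0)
W_self = rng.standard_normal((3, 4))
W_neigh = rng.standard_normal((3, 4))
H1 = message_passing_step(H, edges, W_self, W_neigh)
print(H1.shape)                         # each node now has a 4-dim embedding
```

After one step each node's embedding reflects its immediate neighborhood; repeating the step widens the receptive field by one hop per round, which is why capturing long-range dependencies typically requires either many rounds or the auxiliary mechanisms mentioned above.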
Papers
Hyperbolic Hypergraph Neural Networks for Multi-Relational Knowledge Hypergraph Representation
Mengfan Li, Xuanhua Shi, Chenqi Qiao, Teng Zhang, Hai Jin
Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs without Message Passing
Matthias Kohn, Marcel Hoffmann, Ansgar Scherp
Mixture of Experts Meets Decoupled Message Passing: Towards General and Adaptive Node Classification
Xuanze Chen, Jiajun Zhou, Shanqing Yu, Qi Xuan