Message Passing
Message passing, a fundamental concept in graph neural networks (GNNs), involves iteratively exchanging information between nodes in a graph to learn node representations. Current research focuses on improving the efficiency and expressiveness of message passing, exploring techniques like dynamic hierarchy learning, optimized pooling operators, and incorporating information from large language models or random walks to capture long-range dependencies. These advancements are impacting diverse fields, including physics simulation, drug discovery, and recommendation systems, by enabling more accurate and efficient analysis of complex, graph-structured data.
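The iterative neighbor-exchange idea described above can be sketched in a few lines. The following is a minimal, illustrative example of one message-passing round (mean aggregation over neighbors followed by a learned linear update and ReLU); the function name, weights, and toy graph are assumptions for illustration, not drawn from any of the papers listed below.

```python
import numpy as np

def message_passing_step(X, A, W_self, W_neigh):
    """One round of message passing: each node aggregates (mean) its
    neighbors' features, then combines them with its own features
    through learned weight matrices, followed by a ReLU."""
    deg = A.sum(axis=1, keepdims=True)
    deg = np.maximum(deg, 1.0)          # guard against isolated nodes
    neigh_mean = (A @ X) / deg          # mean of neighbor features
    return np.maximum(0.0, X @ W_self + neigh_mean @ W_neigh)

# Toy path graph on 3 nodes (edges 0-1 and 1-2), one-hot features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3)
rng = np.random.default_rng(0)
W_self = rng.normal(size=(3, 4))
W_neigh = rng.normal(size=(3, 4))

H = message_passing_step(X, A, W_self, W_neigh)
print(H.shape)  # (3, 4)
```

Stacking several such rounds lets information propagate across multi-hop paths, which is exactly where the oversmoothing and oversquashing issues studied in the papers below arise.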
100 papers
Papers
May 21, 2025
Oversmoothing, "Oversquashing", Heterophily, Long-Range, and more: Demystifying Common Beliefs in Graph Machine Learning
Adrian Arnaiz-Rodriguez, Federico Errica (ELLIS Alicante, NEC Laboratories Europe)

Beyond Node Attention: Multi-Scale Harmonic Encoding for Feature-Wise Graph Message Passing
Longlong Li, Cunquan Qu, Guanghui Wang (Shandong University)
January 24, 2025
December 11, 2024
Hyperbolic Hypergraph Neural Networks for Multi-Relational Knowledge Hypergraph Representation
Mengfan Li, Xuanhua Shi, Chenqi Qiao, Teng Zhang, Hai Jin

Edge-Splitting MLP: Node Classification on Homophilic and Heterophilic Graphs without Message Passing
Matthias Kohn, Marcel Hoffmann, Ansgar Scherp

Mixture of Experts Meets Decoupled Message Passing: Towards General and Adaptive Node Classification
Xuanze Chen, Jiajun Zhou, Shanqing Yu, Qi Xuan