Message Passing
Message passing, a fundamental concept in graph neural networks (GNNs), involves iteratively exchanging information between a graph's nodes to learn node representations. Current research focuses on improving the efficiency and expressiveness of message passing, exploring techniques such as dynamic hierarchy learning, optimized pooling operators, and the incorporation of information from large language models or of structural signals from random walks to capture long-range dependencies. These advances are impacting diverse fields, including physics simulation, drug discovery, and recommendation systems, by enabling more accurate and efficient analysis of complex, graph-structured data.
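The core idea can be sketched as a single message-passing round: each node gathers its neighbors' feature vectors, aggregates them, and combines the result with its own state through learned weights. The mean aggregation, ReLU update, and the names `W_self` and `W_nbr` below are illustrative assumptions, not the method of any particular paper listed here.

```python
import numpy as np

def message_passing_step(h, edges, W_self, W_nbr):
    """One round of message passing: each node mean-aggregates its
    neighbors' features, then combines them with its own state.
    h: (n, d) node features; edges: list of (src, dst) pairs."""
    n = h.shape[0]
    agg = np.zeros_like(h)
    deg = np.zeros(n)
    for src, dst in edges:              # messages flow src -> dst
        agg[dst] += h[src]
        deg[dst] += 1
    deg = np.maximum(deg, 1)            # guard isolated nodes against /0
    agg = agg / deg[:, None]            # mean aggregation
    return np.maximum(0.0, h @ W_self + agg @ W_nbr)  # ReLU update

# Tiny path graph 0-1-2 with 2-dim features (undirected: both directions)
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
rng = np.random.default_rng(0)
W_self = rng.standard_normal((2, 2))
W_nbr = rng.standard_normal((2, 2))
h1 = message_passing_step(h, edges, W_self, W_nbr)
print(h1.shape)
```

Stacking k such rounds lets information propagate k hops, which is exactly the locality limitation that the random-walk and long-range-dependency work listed below aims to overcome.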
Papers
Combining Graph Neural Network and Mamba to Capture Local and Global Tissue Spatial Relationships in Whole Slide Images
Ruiwen Ding, Kha-Dinh Luong, Erika Rodriguez, Ana Cristina Araujo Lemos da Silva, William Hsu
Learning Long Range Dependencies on Graphs via Random Walks
Dexiong Chen, Till Hendrik Schulz, Karsten Borgwardt