Paper ID: 2210.09475
FIMP: Foundation Model-Informed Message Passing for Graph Neural Networks
Syed Asad Rizvi, Nhi Nguyen, Haoran Lyu, Benjamin Christensen, Josue Ortega Caro, Antonio H. O. Fonseca, Emanuele Zappala, Maryam Bagherian, Christopher Averill, Chadi G. Abdallah, Amin Karbasi, Rex Ying, Maria Brbic, Rahul Madhav Dhodapkar, David van Dijk
Foundation models have revolutionized the landscape of Deep Learning, serving as versatile platforms that can be adapted to a wide range of downstream tasks. Despite this adaptability, applications of foundation models to downstream graph-based tasks have been limited, and there remains no convenient way to leverage large-scale non-graph pretrained models in graph-structured settings. In this work, we present a new framework, Foundation Model-Informed Message Passing (FIMP), which bridges foundation models and graph neural networks (GNNs) through a simple concept: constructing message-passing operators from pretrained foundation model weights. We show that this approach improves performance on graph-based tasks across a number of data domains, allowing GNNs to leverage the knowledge of foundation models.
Submitted: Oct 17, 2022
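
The abstract's core idea, reusing pretrained attention weights as a message-passing operator, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the feature-token split, the frozen weights, and all names (`FIMPSketchLayer`, `num_tokens`) are illustrative assumptions. Node features are projected into a short sequence of feature tokens, and a (nominally pretrained, here randomly initialized) attention layer computes cross-attention between neighboring nodes' token sequences to form edge messages.

```python
# Minimal sketch of a foundation-model-informed message-passing layer.
# Hypothetical illustration only; not the paper's exact method.
import torch
import torch.nn as nn

class FIMPSketchLayer(nn.Module):
    def __init__(self, attn: nn.MultiheadAttention, in_dim: int,
                 num_tokens: int = 4):
        super().__init__()
        self.d = attn.embed_dim            # token dim of the pretrained layer
        self.k = num_tokens                # feature tokens per node (assumption)
        self.proj = nn.Linear(in_dim, self.k * self.d)  # features -> token sequence
        self.attn = attn                   # attention weights taken from a foundation model
        for p in self.attn.parameters():   # keep the pretrained weights frozen (assumption)
            p.requires_grad = False

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [N, in_dim]; edge_index: [2, E] with rows (src, dst)
        n = x.size(0)
        tok = self.proj(x).view(n, self.k, self.d)              # [N, k, d]
        src, dst = edge_index
        # Cross-attention per edge: destination tokens query source tokens.
        q = tok[dst].transpose(0, 1)                            # [k, E, d]
        kv = tok[src].transpose(0, 1)                           # [k, E, d]
        msg, _ = self.attn(q, kv, kv)                           # [k, E, d]
        msg = msg.transpose(0, 1).reshape(-1, self.k * self.d)  # [E, k*d]
        # Sum-aggregate incoming messages at each destination node.
        out = torch.zeros(n, self.k * self.d, device=x.device)
        out.index_add_(0, dst, msg)
        return out

# Usage example with a stand-in attention layer; in practice one would copy
# weights from an actual pretrained model rather than initialize randomly.
attn = nn.MultiheadAttention(embed_dim=32, num_heads=4)
layer = FIMPSketchLayer(attn, in_dim=16)
x = torch.randn(10, 16)                    # 10 nodes, 16-dim features
edge_index = torch.randint(0, 10, (2, 40)) # 40 random directed edges
out = layer(x, edge_index)                 # [10, 128] aggregated messages
```

Treating each edge as a short cross-attention problem lets the pretrained projections shape how neighbor information is mixed, which is one plausible reading of "constructing message-passing operators from pretrained foundation model weights."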