Paper ID: 2401.16176
A Survey on Structure-Preserving Graph Transformers
Van Thuy Hoang, O-Joun Lee
The transformer architecture has shown remarkable success in various domains, such as natural language processing and computer vision. When it comes to graph learning, transformers must not only capture the interactions between pairs of nodes but also preserve the graph structures that encode the underlying relations and proximity between them, demonstrating the expressive power needed to distinguish different graph structures. Accordingly, numerous structure-preserving graph transformers have been proposed and widely applied to tasks such as graph-level prediction in bioinformatics and chemoinformatics. However, strategies for graph structure preservation have not been well organized and systematized in the literature. In this paper, we provide a comprehensive overview of structure-preserving graph transformers and generalize these methods from the perspective of their design objectives. First, we divide the strategies into four main groups: node feature modulation, context node sampling, graph rewriting, and transformer architecture improvements. We then further categorize the strategies according to the coverage and goals of graph structure preservation. Finally, we discuss challenges and future directions for graph transformer models to preserve graph structure and understand the nature of graphs.
Submitted: Jan 29, 2024