AI Navigate

Graph Tokenization for Bridging Graphs and Transformers

arXiv cs.AI / 3/13/2026


Key Points

  • The paper introduces a graph tokenization framework that converts graphs into sequential representations using reversible graph serialization combined with Byte Pair Encoding, enabling Transformers to process graphs without changes to their architecture.
  • The method guides serialization with global statistics of graph substructures so frequently occurring substructures are represented as tokens that BPE can merge, preserving structural information.
  • Empirical results show state-of-the-art performance on 14 graph benchmarks, frequently outperforming both graph neural networks and specialized graph transformers.
  • The approach bridges graph-structured data and sequence models, with the authors providing the code on GitHub for reproducibility.
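The reversible serialization in the first key point can be illustrated with a toy round-trip: a labeled graph is flattened into a token sequence, and the exact graph is recovered from the tokens. This is a sketch under my own assumptions — the node/edge token formats below are illustrative, not the paper's actual scheme.

```python
# Illustrative sketch of reversible graph serialization (not the paper's
# exact procedure): an undirected labeled graph becomes a flat token
# sequence from which the original graph is reconstructed losslessly.

def serialize(graph):
    """graph: dict mapping node id -> (label, set of neighbor ids)."""
    tokens = []
    for u in sorted(graph):
        label, nbrs = graph[u]
        tokens.append(f"N{u}:{label}")       # node token: id + label
        for v in sorted(nbrs):
            if u < v:                        # emit each undirected edge once
                tokens.append(f"E{u}-{v}")   # edge token
    return tokens

def deserialize(tokens):
    graph, edges = {}, []
    for tok in tokens:
        if tok.startswith("N"):
            uid, label = tok[1:].split(":")
            graph[int(uid)] = (label, set())
        else:                                # edge token "Eu-v"
            u, v = map(int, tok[1:].split("-"))
            edges.append((u, v))
    for u, v in edges:                       # second pass: all nodes exist
        graph[u][1].add(v)
        graph[v][1].add(u)
    return graph

# A labeled triangle round-trips exactly.
g = {0: ("C", {1, 2}), 1: ("O", {0, 2}), 2: ("C", {0, 1})}
assert deserialize(serialize(g)) == g
```

Because the mapping is invertible, no structural information is lost before the BPE stage — the property the paper relies on.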

Abstract

The success of large pretrained Transformers is closely tied to tokenizers, which convert raw input into discrete symbols. Extending these models to graph-structured data remains a significant challenge. In this work, we introduce a graph tokenization framework that generates sequential representations of graphs by combining reversible graph serialization, which preserves graph information, with Byte Pair Encoding (BPE), a widely adopted tokenizer in large language models (LLMs). To better capture structural information, the graph serialization process is guided by global statistics of graph substructures, ensuring that frequently occurring substructures appear more often in the sequence and can be merged by BPE into meaningful tokens. Empirical results demonstrate that the proposed tokenizer enables Transformers such as BERT to be directly applied to graph benchmarks without architectural modifications. The proposed approach achieves state-of-the-art results on 14 benchmark datasets and frequently outperforms both graph neural networks and specialized graph transformers. This work bridges the gap between graph-structured data and the ecosystem of sequence models. Our code is available at https://github.com/BUPT-GAMMA/Graph-Tokenization-for-Bridging-Graphs-and-Transformers.
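The BPE step the abstract describes — merging the most frequent adjacent token pair into a single new token — follows the standard algorithm. A minimal sketch (function names and the toy corpus are illustrative, not the authors' implementation):

```python
from collections import Counter

def most_frequent_pair(sequences):
    """Count adjacent token pairs across all sequences; return the top one."""
    pairs = Counter()
    for seq in sequences:
        pairs.update(zip(seq, seq[1:]))
    return max(pairs, key=pairs.get) if pairs else None

def merge_pair(seq, pair):
    """Replace every occurrence of `pair` with a single merged token."""
    merged, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            merged.append(seq[i] + "+" + seq[i + 1])  # new vocabulary token
            i += 2
        else:
            merged.append(seq[i])
            i += 1
    return merged

# Three serialized graphs sharing a frequent substructure ("A", "B"):
corpus = [["A", "B", "C"], ["A", "B", "D"], ["A", "B", "C"]]
pair = most_frequent_pair(corpus)            # ("A", "B"), seen 3 times
corpus = [merge_pair(s, pair) for s in corpus]
# corpus is now [["A+B", "C"], ["A+B", "D"], ["A+B", "C"]]
```

This is why the serialization is biased by global substructure statistics: the more often a substructure's tokens appear adjacently across the corpus, the earlier BPE merges them into a single structural token.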