How Embeddings Shape Graph Neural Networks: Classical vs Quantum-Oriented Node Representations

arXiv cs.LG / 4/17/2026


Key Points

  • The paper benchmarks how different node embedding choices affect graph neural network (GNN) performance for graph classification, addressing the limitation that prior studies compared embeddings under mismatched experimental setups.
  • It evaluates classical baselines against quantum-oriented node representations, including circuit-defined variational embeddings and quantum-inspired embeddings derived from graph operators and linear-algebraic constructions.
  • Using a single unified pipeline with identical backbone, stratified splits, and matched optimization/early stopping, the study isolates embedding choice as the main variable.
  • Results show strong dataset dependence: quantum-oriented embeddings provide more consistent benefits on structure-driven benchmarks, while social graphs with limited node attributes often perform best with classical embeddings.
  • The work provides practical guidance on trade-offs among inductive bias, trainability, and stability when selecting quantum-oriented embeddings for graph learning under a fixed training budget.
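To make the "quantum-inspired embeddings derived from graph operators and linear-algebraic constructions" concrete, here is a minimal sketch of one such construction: node features taken from the low-frequency eigenvectors of the normalized graph Laplacian. The paper's exact operators and constructions may differ; this function and its parameters are illustrative assumptions.

```python
import numpy as np

def laplacian_eigvec_embedding(adj: np.ndarray, k: int) -> np.ndarray:
    """k-dimensional node embedding from the normalized graph Laplacian's
    lowest-frequency eigenvectors -- one common linear-algebraic construction
    of structural node features (illustrative; not the paper's exact recipe)."""
    n = len(adj)
    deg = adj.sum(axis=1)
    # D^{-1/2}, guarding against isolated nodes
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(np.maximum(deg, 1e-12)), 0.0)
    # L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    return eigvecs[:, 1:k + 1]              # skip the trivial constant mode

# Example: a 4-cycle graph
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
emb = laplacian_eigvec_embedding(adj, k=2)  # shape (4, 2)
```

Such embeddings are precomputed from the graph structure alone, so they add no trainable parameters, in contrast to circuit-defined variational embeddings, whose parameters are optimized within the fixed training budget.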

Abstract

Node embeddings act as the information interface for graph neural networks, yet their empirical impact is often reported under mismatched backbones, splits, and training budgets. This paper provides a controlled benchmark of embedding choices for graph classification, comparing classical baselines with quantum-oriented node representations under a unified pipeline. We evaluate two classical baselines alongside quantum-oriented alternatives, including a circuit-defined variational embedding and quantum-inspired embeddings computed via graph operators and linear-algebraic constructions. All variants are trained and tested with the same backbone, stratified splits, identical optimization and early stopping, and consistent metrics. Experiments on five different TU datasets and on QM9 converted to classification via target binning show clear dataset dependence: quantum-oriented embeddings yield the most consistent gains on structure-driven benchmarks, while social graphs with limited node attributes remain well served by classical baselines. The study highlights practical trade-offs between inductive bias, trainability, and stability under a fixed training budget, and offers a reproducible reference point for selecting quantum-oriented embeddings in graph learning.
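The abstract mentions converting QM9 from regression to classification "via target binning." A minimal sketch of one standard way to do this is quantile binning of a continuous target; the bin count and scheme here are assumptions for illustration, not the paper's stated protocol.

```python
import numpy as np

def bin_targets(y: np.ndarray, n_bins: int = 3) -> np.ndarray:
    """Turn a continuous regression target into class labels by quantile
    binning (illustrative: the paper's actual bin count/scheme may differ)."""
    # Interior quantile edges, e.g. the 1/3 and 2/3 quantiles for 3 bins
    edges = np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(y, edges)  # labels in {0, ..., n_bins - 1}

y = np.array([0.1, 0.4, 0.5, 0.9, 1.2, 2.0])
labels = bin_targets(y, n_bins=3)  # roughly balanced classes
```

Quantile edges yield roughly balanced classes, which makes the stratified splits mentioned in the abstract straightforward: each split can preserve the per-class proportions of the binned labels.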