Graph Energy Matching: Transport-Aligned Energy-Based Modeling for Graph Generation

arXiv stat.ML / 3/25/2026


Key Points

  • The paper introduces Graph Energy Matching (GEM), an energy-based generative framework for graphs that models relative likelihoods to enable composable inference tasks like conditional generation and constraint enforcement.
  • GEM addresses a key weakness of discrete energy-based models—inefficient or unstable sampling caused by spurious local minima in off-support regions—thereby narrowing the historical fidelity gap versus discrete diffusion approaches.
  • The method is motivated by a transport-map optimization view of the JKO scheme, learning a permutation-invariant potential energy that both guides samples from noise toward data and refines them in high-likelihood regions.
  • A new sampling protocol uses an energy-based switch to transition from rapid gradient-guided transport into a mixing regime for broader exploration of the learned graph distribution.
  • Experiments on molecular graph benchmarks show GEM matching or exceeding strong discrete diffusion baselines and support inference-time capabilities such as property-constrained sampling and geodesic interpolation between graphs.
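The two-phase sampling protocol in the bullets above can be sketched with a toy potential. Everything here (the quadratic `energy`, the step sizes, the switch threshold) is a hypothetical stand-in to illustrate the switch mechanism, not the paper's implementation:

```python
import numpy as np

def energy(x, w):
    # Toy quadratic potential standing in for the learned energy E_theta;
    # low energy near the (hypothetical) data point w.
    return 0.5 * np.sum((x - w) ** 2)

def grad_energy(x, w):
    return x - w

def sample_with_energy_switch(x0, w, step=0.1, switch_threshold=0.5,
                              n_transport=200, n_mix=200, noise=0.05, seed=0):
    """Two-phase sampler sketch: (i) rapid gradient-guided transport until the
    energy drops below a threshold, then (ii) noisy Langevin-style updates
    that mix around the low-energy region for exploration."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    # Phase (i): deterministic gradient descent on the energy (fast transport).
    for _ in range(n_transport):
        if energy(x, w) < switch_threshold:
            break  # the energy-based switch fires here
        x -= step * grad_energy(x, w)
    # Phase (ii): Langevin-style mixing with injected Gaussian noise.
    for _ in range(n_mix):
        x -= step * grad_energy(x, w)
        x += np.sqrt(2.0 * step) * noise * rng.standard_normal(x.shape)
    return x
```

The switch condition (`energy < switch_threshold`) is what separates the transport regime from the mixing regime; in GEM the same learned potential drives both phases.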

Abstract

Energy-based models for discrete domains, such as graphs, explicitly capture relative likelihoods, naturally enabling composable probabilistic inference tasks like conditional generation or enforcing constraints at test time. However, discrete energy-based models typically struggle with efficient and high-quality sampling, as off-support regions often contain spurious local minima, trapping samplers and causing training instabilities. This has historically resulted in a fidelity gap relative to discrete diffusion models. We introduce Graph Energy Matching (GEM), a generative framework for graphs that closes this fidelity gap. Motivated by the transport map optimization perspective of the Jordan-Kinderlehrer-Otto (JKO) scheme, GEM learns a permutation-invariant potential energy that simultaneously provides transport-aligned guidance from noise toward data and refines samples within regions of high data likelihood. Further, we introduce a sampling protocol that leverages an energy-based switch to seamlessly bridge from (i) rapid, gradient-guided transport toward high-probability regions to (ii) a mixing regime for exploration of the learned graph distribution. On molecular graph benchmarks, GEM matches or exceeds strong discrete diffusion baselines. Beyond sample quality, explicit modeling of relative likelihood enables targeted exploration at inference time, facilitating compositional generation, property-constrained sampling, and geodesic interpolation between graphs.
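The permutation invariance mentioned in the abstract can be illustrated with a minimal Deep Sets-style energy over an adjacency matrix and node features: per-node messages are aggregated by summation, so relabeling the nodes leaves the scalar energy unchanged. The architecture and weight names below are illustrative assumptions, not the paper's network:

```python
import numpy as np

def graph_energy(adj, feats, w1, w2):
    """Minimal sketch of a permutation-invariant graph energy.

    adj:   (n, n) adjacency matrix
    feats: (n, d) node feature matrix
    w1:    (d, h) hypothetical message weights
    w2:    (h,)   hypothetical readout weights
    """
    # One round of message passing: each node sums its neighbors' features.
    # This map is equivariant: permuting the nodes permutes the rows of h.
    h = np.tanh((adj @ feats) @ w1)
    # Sum pooling over nodes makes the output invariant to node order.
    pooled = h.sum(axis=0)
    return float(pooled @ w2)  # scalar energy
```

Applying any permutation matrix `P` as `graph_energy(P @ adj @ P.T, P @ feats, w1, w2)` returns the same value, which is the property a graph potential needs so that the energy depends on the graph, not on its node labeling.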