Beyond Spectral Clustering: Probabilistic Cuts for Differentiable Graph Partitioning

arXiv stat.ML / 4/2/2026


Key Points

  • The paper proposes a unified probabilistic framework for differentiable graph partitioning that generalizes probabilistic relaxations beyond prior RatioCut-focused approaches.
  • It provides analytic, tight upper bounds connecting expected relaxed objectives to expected discrete graph cuts, using integral representations and Gauss hypergeometric functions.
  • The framework yields numerically stable, closed-form forward and backward gradients, enabling end-to-end and online learning without eigendecompositions.
  • It covers a broad class of cuts, including Normalized Cut, offering more general guarantees and more principled gradients than earlier methods.
  • The results aim to provide a rigorous foundation for scalable differentiable clustering and contrastive learning objectives on graphs.
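To make the first two points concrete, here is a minimal sketch of a probabilistic (soft) relaxation of Normalized Cut. It is illustrative only: the paper's exact bounds and estimator are not reproduced here, and the objective below is the simple plug-in relaxation sum_k p_k^T W (1 - p_k) / (p_k^T d) under independent softmax assignments, with all names (`soft_normalized_cut`, `logits`) our own.

```python
import numpy as np

def soft_normalized_cut(W, logits):
    """Soft relaxation of Normalized Cut (illustrative, not the paper's bound).

    W:      (n, n) symmetric nonnegative adjacency matrix.
    logits: (n, k) unnormalized assignment scores; a row-wise softmax
            gives per-node cluster probabilities P.
    Returns sum_k  p_k^T W (1 - p_k) / (p_k^T d),
    where p_k is column k of P and d is the degree vector.
    """
    # Row-wise softmax (shift by the max for numerical stability).
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)                 # (n, k) soft assignments
    d = W.sum(axis=1)                                 # node degrees
    # Per-cluster soft cut: sum_{i,j} W_ij P_ik (1 - P_jk).
    cut = np.einsum('ik,ij,jk->k', P, W, 1.0 - P)
    vol = P.T @ d                                     # per-cluster soft volume
    return float(np.sum(cut / vol))
```

Because the objective is a smooth function of `logits`, it can be minimized with any gradient-based optimizer, with no eigendecomposition in the loop; on a graph of two disconnected edges, logits that separate the two components drive the objective toward 0, while uniform logits give a strictly positive value.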

Abstract

Probabilistic relaxations of graph cuts offer a differentiable alternative to spectral clustering, enabling end-to-end and online learning without eigendecompositions, yet prior work centered on RatioCut and lacked general guarantees and principled gradients. We present a unified probabilistic framework that covers a wide class of cuts, including Normalized Cut. Our framework provides tight analytic upper bounds on expected discrete cuts via integral representations and Gauss hypergeometric functions, with closed-form forward and backward passes. Together, these results deliver a rigorous, numerically stable foundation for scalable, differentiable graph partitioning covering a wide range of clustering and contrastive learning objectives.
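To see where expectations of cuts become analytically delicate, consider independent categorical assignments with probabilities $\pi_{ik}$ (notation ours; the paper's derivation may differ, and this is only a plausible sketch of where the integral representations enter):

```latex
\mathbb{E}\!\left[\operatorname{cut}(A_k,\bar A_k)\right]
  = \sum_{i,j} w_{ij}\,\pi_{ik}\,(1-\pi_{jk}),
\qquad
\mathbb{E}\!\left[\frac{\operatorname{cut}_k}{\operatorname{vol}_k}\right]
  \neq \frac{\mathbb{E}[\operatorname{cut}_k]}{\mathbb{E}[\operatorname{vol}_k]},
```

so ratio objectives like Normalized Cut cannot be handled by plugging in expectations. One standard device for the reciprocal volume is the integral identity

```latex
\frac{1}{v} \;=\; \int_0^\infty e^{-t v}\,dt ,
```

whose expectation over the categorical assignments factorizes node by node and can yield the hypergeometric-type expressions the abstract refers to.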