The Diffusion-Attention Connection

arXiv cs.LG / 4/14/2026

Key Points

  • The paper argues that transformers, diffusion maps, and magnetic Laplacians are not separate frameworks but different regimes of a unified Markov geometry constructed from pre-softmax query scores.

Abstract

Transformers, diffusion maps, and magnetic Laplacians are usually treated as separate tools; we show they are all different regimes of a single Markov geometry built from pre-softmax query scores. We define a QK "bidivergence" whose exponentiated and normalized forms yield attention, diffusion maps, and magnetic diffusion, and we use products of experts and Schrödinger bridges to connect and organize these regimes into equilibrium, nonequilibrium steady-state, and driven dynamics.
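To make the claimed connection concrete, here is a minimal numerical sketch (not the paper's exact construction) of how the same pre-softmax QK score matrix can be exponentiated and normalized three different ways: row-wise softmax gives standard attention, the symmetric part gives a reversible diffusion-map kernel, and the antisymmetric part enters as a complex phase in the style of a magnetic Laplacian. The splitting into symmetric and antisymmetric parts is an illustrative assumption, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
Q = rng.normal(size=(n, d))
K = rng.normal(size=(n, d))

# Pre-softmax query scores (standard scaled dot-product).
S = Q @ K.T / np.sqrt(d)

# 1) Attention regime: row-wise softmax of the raw (asymmetric) scores.
attn = np.exp(S - S.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)

# 2) Diffusion-map regime: exponentiate the symmetric part of the scores,
#    then degree-normalize into a reversible Markov kernel.
S_sym = 0.5 * (S + S.T)
W = np.exp(S_sym)
P_diff = W / W.sum(axis=1, keepdims=True)

# 3) Magnetic regime (sketch): the antisymmetric part of the scores acts
#    as a phase on the symmetric kernel, as in a magnetic Laplacian.
S_anti = 0.5 * (S - S.T)
P_mag = P_diff * np.exp(1j * S_anti)

# Attention and the diffusion kernel are row-stochastic; the magnetic
# kernel is row-stochastic in modulus since |exp(i*theta)| = 1.
print(np.allclose(attn.sum(axis=1), 1.0))          # True
print(np.allclose(P_diff.sum(axis=1), 1.0))        # True
print(np.allclose(np.abs(P_mag).sum(axis=1), 1.0)) # True
```

All three objects are built from the one score matrix `S`, which is the sense in which the abstract says the frameworks are regimes of a single Markov geometry.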