Geometric Decoupling: Diagnosing the Structural Instability of Latent Diffusion Models

arXiv cs.AI / 4/22/2026


Key Points

  • The paper argues that Latent Diffusion Models (LDMs) are prone to “latent space brittleness,” which shows up as discontinuous semantic jumps during image editing.
  • It introduces a Riemannian geometric diagnostic method that studies the generative Jacobian and separates effects into Local Scaling (model capacity) and Local Complexity (curvature).
  • The authors find a “Geometric Decoupling” phenomenon: in out-of-distribution (OOD) generation, extreme curvature is consumed by unstable semantic boundaries rather than producing perceptible image detail.
  • They identify “Geometric Hotspots” as the structural cause of instability and propose an intrinsic geometric metric to more robustly assess generative reliability.

Abstract

Latent Diffusion Models (LDMs) achieve high-fidelity synthesis but suffer from latent space brittleness, causing discontinuous semantic jumps during editing. We introduce a Riemannian framework to diagnose this instability by analyzing the generative Jacobian, decomposing geometry into *Local Scaling* (capacity) and *Local Complexity* (curvature). Our study uncovers a “Geometric Decoupling”: while curvature in normal generation functionally encodes image detail, OOD generation exhibits a functional decoupling where extreme curvature is wasted on unstable semantic boundaries rather than perceptible details. This geometric misallocation identifies “Geometric Hotspots” as the structural root of instability, providing a robust intrinsic metric for diagnosing generative reliability.
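To make the diagnostic concrete: the paper's two quantities can be illustrated on a toy decoder. The sketch below is not the authors' implementation; it uses a hypothetical 2D→3D map standing in for an LDM decoder, a finite-difference Jacobian, the Riemannian volume element √det(JᵀJ) as a Local Scaling proxy, and the rate of change of the Jacobian across a small latent neighbourhood as a crude Local Complexity (curvature) proxy.

```python
import numpy as np

def decoder(z):
    """Toy nonlinear map g: R^2 -> R^3 standing in for an LDM decoder (assumption)."""
    x, y = z
    return np.array([np.sin(x) + y, x * y, np.cos(y) + x**2])

def jacobian(f, z, eps=1e-6):
    """Finite-difference generative Jacobian J[i, j] = d f_i / d z_j."""
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for j in range(z.size):
        dz = np.zeros_like(z)
        dz[j] = eps
        J[:, j] = (f(z + dz) - f0) / eps
    return J

def local_scaling(J):
    """Riemannian volume element sqrt(det(J^T J)): how much the decoder
    stretches an infinitesimal latent volume (a capacity proxy)."""
    return np.sqrt(np.linalg.det(J.T @ J))

def local_complexity(f, z, eps=1e-3):
    """Curvature proxy: how fast the Jacobian itself changes across a small
    latent neighbourhood (max Frobenius-norm change per unit step)."""
    z = np.asarray(z, dtype=float)
    J0 = jacobian(f, z)
    steps = (np.array([eps, 0.0]), np.array([0.0, eps]))
    return max(np.linalg.norm(jacobian(f, z + d) - J0) for d in steps) / eps

z = np.array([0.3, -0.7])
print("local scaling   :", local_scaling(jacobian(decoder, z)))
print("local complexity:", local_complexity(decoder, z))
```

Sweeping `local_complexity` over a latent grid would flag the “Geometric Hotspots” the paper describes: regions where curvature spikes without a corresponding gain in output detail.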