Observable Geometry of Singular Statistical Models

arXiv stat.ML / 4/3/2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The paper tackles singular statistical models where multiple parameter values produce the same data distribution, causing non-identifiability and invalidating standard asymptotic theory.
  • It proposes a parameterization-invariant geometric framework using “observable charts,” which are collections of distribution functionals that can distinguish probability measures directly on the model space.
  • The authors formalize “observable completeness” and “observable order” to measure, respectively, the ability to detect identifiable directions and the strength of higher-order distinguishability under analytic perturbations.
  • The main theorem shows that observable order lower-bounds how quickly the Kullback–Leibler divergence shrinks along analytic paths, thereby linking intrinsic model geometry to statistical distinguishability.
  • The framework is demonstrated on reduced-rank regression and Gaussian mixture models, where it recovers classical behavior in regular regimes and exposes degeneracies at singular points.
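
The rate claim in the main theorem can be illustrated with a standard toy example (this numerical sketch is ours, not code from the paper): in a symmetric two-component Gaussian mixture, the first-order score vanishes along the symmetric path, so the Kullback–Leibler divergence shrinks at a higher order than along a generic (regular) path.

```python
# Numerical sketch: KL divergence rates along two analytic paths in a
# 1-D Gaussian mixture (illustrative; not the paper's code).
#   Regular path:   p_t = 0.5*N(t,1) + 0.5*N(0,1)   -> KL(p_0 || p_t) ~ t^2 / 8
#   Singular path:  q_t = 0.5*N(t,1) + 0.5*N(-t,1)  -> KL(p_0 || q_t) ~ t^4 / 4
import numpy as np

def phi(x):
    """Standard normal density."""
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

x = np.linspace(-12, 12, 200001)
p0 = phi(x)  # base distribution N(0,1), reached at t = 0 by both paths

def kl_from_p0(pt):
    """KL(p0 || pt) by Riemann sum on the uniform grid."""
    dx = x[1] - x[0]
    return float(np.sum(p0 * np.log(p0 / pt)) * dx)

for t in (0.2, 0.1, 0.05):
    reg = 0.5 * phi(x - t) + 0.5 * phi(x)       # regular perturbation
    sing = 0.5 * phi(x - t) + 0.5 * phi(x + t)  # symmetric (singular) path
    # The ratios below stabilize near 1/8 and 1/4 respectively as t -> 0.
    print(t, kl_from_p0(reg) / t**2, kl_from_p0(sing) / t**4)
```

Both paths pass through the same point N(0,1) of the model, but the symmetric path is only distinguishable at fourth order, matching the kind of higher-order distinguishability that observable order is designed to quantify.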

Abstract

Singular statistical models arise whenever different parameter values induce the same distribution, leading to non-identifiability and a breakdown of classical asymptotic theory. While existing approaches analyze these phenomena in parameter space, the resulting descriptions depend heavily on parameterization and obscure the intrinsic statistical structure of the model. In this paper, we introduce an invariant framework based on \emph{observable charts}: collections of functionals of the data distribution that distinguish probability measures. These charts define local coordinate systems directly on the model space, independent of parameterization. We formalize \emph{observable completeness} as the ability of such charts to detect identifiable directions, and introduce \emph{observable order} to quantify higher-order distinguishability along analytic perturbations. Our main result establishes that, under mild regularity conditions, observable order provides a lower bound on the rate at which Kullback–Leibler divergence vanishes along analytic paths. This connects intrinsic geometric structure in model space to statistical distinguishability and recovers classical behavior in regular models while extending naturally to singular settings. We illustrate the framework in reduced-rank regression and Gaussian mixture models, where observable coordinates reveal both identifiable structure and singular degeneracies. These results suggest that observable charts provide a unified and parameterization-invariant language for studying singular models and offer a pathway toward intrinsic formulations of invariants such as learning coefficients.
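
The reduced-rank regression example makes the parameterization-invariance point concrete. The sketch below is our own illustration (not the paper's code): factoring the coefficient matrix as B = A @ C leaves the factors non-identifiable, since any invertible gauge matrix G gives a different parameter pair with the identical product, while the product B itself, a functional of the data distribution, is the invariant "observable" coordinate.

```python
# Illustrative sketch of non-identifiability in reduced-rank regression
# y = B x + noise, with B = A @ C (A: d x r, C: r x d, r < d).
import numpy as np

rng = np.random.default_rng(0)
d, r = 4, 2
A = rng.normal(size=(d, r))
C = rng.normal(size=(r, d))
G = rng.normal(size=(r, r)) + 3 * np.eye(r)  # invertible r x r gauge transform

# Two distinct parameter pairs, (A, C) and (A @ G, inv(G) @ C) ...
B1 = A @ C
B2 = (A @ G) @ (np.linalg.inv(G) @ C)

# ... induce the same conditional-mean map, hence the same distribution.
print(np.allclose(B1, B2))  # -> True
```

Only the product B (the observable coordinate) is estimable from data; the factors A and C are coordinates in parameter space that the observable-chart framework deliberately avoids.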