The Minimax Lower Bound of Kernel Stein Discrepancy Estimation

arXiv stat.ML / 3/31/2026


Key Points

  • The paper proves that the minimax lower bound for kernel Stein discrepancy (KSD) estimation is of order $n^{-1/2}$, which shows that existing KSD estimators attaining this known rate are minimax optimal.
  • The first result covers KSD estimation on $\mathbb R^d$ with the Langevin–Stein operator (the standard construction is recalled after this list), including an explicit constant for the Gaussian kernel.
  • The explicit Gaussian-kernel analysis suggests the estimation difficulty can grow exponentially with dimension $d$.
  • It also extends to general domains, settling the minimax lower bound beyond Euclidean settings.
  • The work provides two complementary proof strategies that together characterize the statistical limits of KSD-based goodness-of-fit estimation.
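
As a refresher on the object whose estimation the paper studies, here is the standard form of the squared KSD on $\mathbb R^d$ built from the Langevin–Stein operator and a kernel $k$; this recap follows the usual KSD literature and is not spelled out on this page. Writing $s_p(x) = \nabla_x \log p(x)$ for the score of the target density $p$,

$$
\mathrm{KSD}^2(q \,\|\, p) = \mathbb{E}_{x, x' \sim q}\big[u_p(x, x')\big],
$$

$$
u_p(x, x') = s_p(x)^\top s_p(x')\, k(x, x') + s_p(x)^\top \nabla_{x'} k(x, x') + s_p(x')^\top \nabla_{x} k(x, x') + \operatorname{tr}\!\big(\nabla_x \nabla_{x'} k(x, x')\big).
$$

The $n^{-1/2}$ lower bound concerns how accurately this quantity can be estimated from $n$ i.i.d. samples drawn from $q$.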

Abstract

Kernel Stein discrepancies (KSDs) have emerged as a powerful tool for quantifying goodness-of-fit over the last decade, featuring numerous successful applications. To the best of our knowledge, all existing KSD estimators with known rate achieve $\sqrt{n}$-convergence. In this work, we present two complementary results (with different proof strategies), establishing that the minimax lower bound of KSD estimation is $n^{-1/2}$ and settling the optimality of these estimators. Our first result focuses on KSD estimation on $\mathbb R^d$ with the Langevin–Stein operator; our explicit constant for the Gaussian kernel indicates that the difficulty of KSD estimation may increase exponentially with the dimensionality $d$. Our second result settles the minimax lower bound for KSD estimation on general domains.
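
To make the $\sqrt{n}$-convergence statement concrete, below is a minimal sketch of the usual U-statistic estimator of the squared KSD with a Gaussian kernel, the kind of estimator whose rate the paper's lower bound shows cannot be improved. The function names, the fixed bandwidth `sigma`, and the toy usage at the end are illustrative assumptions, not code from the paper.

```python
import numpy as np

def gaussian_stein_kernel(x, y, score_x, score_y, sigma=1.0):
    """Langevin-Stein kernel u_p(x, y) for the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    d = x.shape[0]
    diff = x - y
    sq_dist = float(diff @ diff)
    k = np.exp(-sq_dist / (2.0 * sigma**2))
    grad_x_k = -diff / sigma**2 * k                       # gradient of k in x
    grad_y_k = diff / sigma**2 * k                        # gradient of k in y
    trace_term = (d / sigma**2 - sq_dist / sigma**4) * k  # trace of the mixed second derivative
    return (float(score_x @ score_y) * k
            + float(score_x @ grad_y_k)
            + float(score_y @ grad_x_k)
            + trace_term)

def ksd_squared_u_stat(samples, score_fn, sigma=1.0):
    """U-statistic estimator of KSD^2(q || p): average of u_p(x_i, x_j)
    over all ordered pairs i != j (O(n^2) cost)."""
    n = len(samples)
    scores = [score_fn(x) for x in samples]
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += gaussian_stein_kernel(samples[i], samples[j],
                                               scores[i], scores[j], sigma)
    return total / (n * (n - 1))

# Illustrative usage: target p = N(0, I_d), whose score is s_p(x) = -x.
rng = np.random.default_rng(0)
samples = rng.normal(size=(200, 2))                # here q = p, so KSD^2 = 0
print(ksd_squared_u_stat(samples, lambda x: -x))   # estimate should be close to 0
```

For an i.i.d. sample of size $n$, this estimator deviates from the population $\mathrm{KSD}^2$ at the $O_p(n^{-1/2})$ rate under standard moment and non-degeneracy conditions, which is exactly the rate the paper proves to be minimax optimal.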