Koopman Subspace Pruning in Reproducing Kernel Hilbert Spaces via Principal Vectors

arXiv stat.ML / 4/3/2026


Key Points

  • The paper addresses how finite-dimensional approximations of the Koopman operator depend on the invariance of the selected subspace, and builds on "subspace pruning", which discards geometrically misaligned directions to improve invariance proximity.
  • It recasts the pruning objective in Reproducing Kernel Hilbert Space (RKHS) geometry by defining and computing principal angles and vectors between a subspace and its image under the Koopman operator; a toy Gram-matrix computation is sketched after this list.
  • The authors provide an exact computational routine for these principal angles and vectors, then scale it to large datasets using randomized Nyström approximations.
  • They introduce two algorithms—Kernel-SPV and Approximate Kernel-SPV—for targeted subspace refinement using principal vectors, with simulation results supporting the approach.
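
As promised above, here is a minimal sketch of how principal angles between two finite-dimensional RKHS subspaces can be computed from Gram matrices alone, via the standard kernelization of the Björck–Golub SVD recipe. The function name, the regularization shift `reg`, and the choice to span the subspaces by kernel sections k(·, x_i) and k(·, y_j) are assumptions of this sketch, not the paper's actual routine.

```python
import numpy as np

def rkhs_principal_angles(K_xx, K_yy, K_xy, reg=1e-10):
    """Principal angles between U = span{k(., x_i)} and V = span{k(., y_j)}.

    K_xx (n x n), K_yy (m x m): Gram matrices of the two sample sets.
    K_xy (n x m): cross-Gram matrix with entries k(x_i, y_j).
    Cholesky factors of the Gram matrices orthonormalize each span, so the
    cosines of the principal angles are the singular values of the small
    matrix Lx^{-1} @ K_xy @ Ly^{-T}.
    """
    n, m = K_xy.shape
    Lx = np.linalg.cholesky(K_xx + reg * np.eye(n))  # K_xx = Lx @ Lx.T
    Ly = np.linalg.cholesky(K_yy + reg * np.eye(m))
    M = np.linalg.solve(Lx, np.linalg.solve(Ly, K_xy.T).T)
    cos = np.linalg.svd(M, compute_uv=False)         # cosines, descending
    return np.arccos(np.clip(cos, 0.0, 1.0))         # angles, ascending
```

The largest returned angle is the invariance-proximity measure described in the abstract below: the closer it is to zero, the closer the subspace is to being invariant under the operator.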

Abstract

Data-driven approximations of the infinite-dimensional Koopman operator rely on finite-dimensional projections, where the predictive accuracy of the resulting models hinges on the invariance of the chosen subspace. Subspace pruning systematically discards geometrically misaligned directions to enhance this invariance proximity, which formally corresponds to the largest principal angle between the subspace and its image under the operator. Yet, existing techniques are largely restricted to Euclidean settings. To bridge this gap, this paper presents an approach for computing principal angles and vectors to enable Koopman subspace pruning within a Reproducing Kernel Hilbert Space (RKHS) geometry. We first outline an exact computational routine, which is subsequently scaled to large datasets using randomized Nyström approximations. Building on these foundations, we introduce the Kernel-SPV and Approximate Kernel-SPV algorithms for targeted subspace refinement via principal vectors. Simulation results validate our approach.
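
For the scaling step, the randomized Nyström approximation the abstract mentions can be illustrated by a standard single-pass, numerically stabilized construction; whether the paper uses this exact variant is an assumption here, and `randomized_nystrom` is an illustrative name.

```python
import numpy as np

def randomized_nystrom(K, rank, seed=0):
    """Rank-`rank` randomized Nyström factor F with K ~ F @ F.T.

    K: symmetric positive semidefinite Gram matrix (float dtype assumed).
    A Gaussian test matrix sketches the range of K; a small spectral shift
    keeps the Cholesky factorization of the core matrix well posed.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    Omega = rng.standard_normal((n, rank))          # random test matrix
    Y = K @ Omega                                   # range sketch of K
    nu = np.finfo(K.dtype).eps * np.linalg.norm(Y)  # stabilizing shift
    Y += nu * Omega                                 # sketch of K + nu*I
    L = np.linalg.cholesky(Omega.T @ Y)             # core Cholesky factor
    return np.linalg.solve(L, Y.T).T                # F = Y @ inv(L).T
```

Replacing exact Gram blocks with such low-rank factors in the angle computation above is the kind of substitution that makes an approximate variant tractable at scale.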
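
Putting the pieces together, a pruning pass in the spirit of Kernel-SPV might look like the toy routine below: compute principal angles and vectors in the RKHS, then keep only the directions whose angles fall below a tolerance. The thresholding rule and the coefficient-matrix representation of the pruned subspace are assumptions of this sketch, not the authors' algorithm.

```python
import numpy as np

def kernel_prune(K_xx, K_yy, K_xy, tol, reg=1e-10):
    """Toy RKHS subspace pruning via principal vectors.

    Returns coefficients A such that f_i = sum_j A[j, i] * k(., x_j) form an
    orthonormal basis of the pruned subspace, retaining only principal
    directions whose angle to the image span is at most `tol` radians.
    """
    n, m = K_xy.shape
    Lx = np.linalg.cholesky(K_xx + reg * np.eye(n))
    Ly = np.linalg.cholesky(K_yy + reg * np.eye(m))
    M = np.linalg.solve(Lx, np.linalg.solve(Ly, K_xy.T).T)
    U, cos, _ = np.linalg.svd(M)                 # left vectors: subspace side
    angles = np.arccos(np.clip(cos, 0.0, 1.0))   # ascending order
    k = int(np.count_nonzero(angles <= tol))     # well-aligned directions
    A = np.linalg.solve(Lx.T, U[:, :k])          # back to kernel coefficients
    return A, angles
```

In this toy setting, the retained span makes a largest principal angle of at most `tol` with the image span, which is precisely the invariance-proximity criterion the abstract targets.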