Accelerating Constrained Sampling: A Large Deviations Approach

arXiv stat.ML / 4/7/2026


Key Points

  • The work analyzes, via a large deviation principle (LDP), the long-time behavior of SRNLD (skew-reflected non-reversible Langevin dynamics), an extension of reflected Langevin dynamics, for sampling a target distribution on a constrained domain.
  • When the skew-symmetric matrix is designed so that its product with the outward unit normal vector field on the boundary is zero, the rate function for the empirical measure can be explicitly characterized, and convergence is shown to be accelerated relative to RLD.
  • Analysis of the same rate function shows that the proposed skew-matrix design also reduces the asymptotic variance (i.e., improves estimation efficiency).
  • In numerical experiments, SRNLMC (skew-reflected non-reversible Langevin Monte Carlo) with this skew-matrix design outperforms RLD-based and conventional methods, corroborating the theoretical conclusions from the LDP analysis.

Abstract

The problem of sampling a target probability distribution on a constrained domain arises in many applications including machine learning. For constrained sampling, various Langevin algorithms have been proposed and studied in the literature, such as projected Langevin Monte Carlo (PLMC), based on the discretization of reflected Langevin dynamics (RLD), and more generally skew-reflected non-reversible Langevin Monte Carlo (SRNLMC), based on the discretization of skew-reflected non-reversible Langevin dynamics (SRNLD). This work focuses on the long-time behavior of SRNLD, where a skew-symmetric matrix is added to RLD. Although acceleration for SRNLD has been studied, it is not clear how one should design the skew-symmetric matrix in the dynamics to achieve good performance in practice. We establish a large deviation principle (LDP) for the empirical measure of SRNLD when the skew-symmetric matrix is chosen such that its product with the outward unit normal vector field on the boundary is zero. By explicitly characterizing the rate functions, we show that this choice of the skew-symmetric matrix accelerates the convergence to the target distribution compared to RLD and reduces the asymptotic variance. Numerical experiments for SRNLMC based on the proposed skew-symmetric matrix show superior performance, which validates the theoretical findings from the large deviations theory.
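To make the iteration concrete, here is a minimal illustrative sketch (not the paper's exact scheme) of an SRNLMC-style update for a standard Gaussian truncated to the unit disk in R². All specifics are assumptions for illustration: the potential U(x) = |x|²/2, the step size, and the state-dependent skew field J(x) = (1 − |x|²)·[[0, −1], [1, 0]], chosen so that J vanishes on the boundary |x| = 1 and hence its product with the outward unit normal is zero there, mimicking the design condition from the abstract. The reflection at the boundary is approximated by a Euclidean projection onto the disk, in the spirit of PLMC.

```python
import numpy as np

def grad_U(x):
    # Gradient of the (assumed) potential U(x) = |x|^2 / 2 of a standard Gaussian.
    return x

def skew(x):
    # Hypothetical skew-symmetric field J(x) = (1 - |x|^2) * [[0, -1], [1, 0]].
    # The scalar factor vanishes on the boundary |x| = 1, so J(x) n(x) = 0 there,
    # mirroring the zero-normal-product design condition.
    a = 1.0 - float(np.dot(x, x))
    return a * np.array([[0.0, -1.0], [1.0, 0.0]])

def project_to_disk(x):
    # Euclidean projection onto the closed unit disk, used here as a
    # simple stand-in for the reflection at the boundary.
    r = np.linalg.norm(x)
    return x / r if r > 1.0 else x

def srnlmc(n_steps=20000, eta=0.01, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    out = np.empty((n_steps, 2))
    I = np.eye(2)
    for k in range(n_steps):
        # Non-reversible drift: -(I + J(x)) grad U(x), plus Gaussian noise.
        drift = -(I + skew(x)) @ grad_U(x)
        x = x + eta * drift + np.sqrt(2.0 * eta) * rng.standard_normal(2)
        x = project_to_disk(x)
        out[k] = x
    return out

samples = srnlmc()
```

Setting the scalar factor in `skew` to zero recovers a plain projected Langevin chain, which makes it easy to compare the two dynamics empirically on the same target.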