Abstract
While the mathematical foundations of score-based generative models are increasingly well understood for unconstrained Euclidean spaces, many practical applications involve data restricted to bounded domains. This paper provides a statistical analysis of reflected diffusion models on the hypercube [0,1]^D for target distributions supported on d-dimensional linear subspaces. A primary challenge in this setting is the absence of Gaussian transition kernels, which play a central role in the standard theory on \mathbb{R}^D. Employing an explicit, easily implementable infinite series expansion of the transition densities, we develop analytic tools to bound the score function and its approximation by sparse ReLU networks. For target densities with Sobolev smoothness \alpha, we establish a convergence rate in the 1-Wasserstein distance of order n^{-\frac{\alpha+1-\delta}{2\alpha+d}} for arbitrarily small \delta > 0, demonstrating that the generative algorithm fully adapts to the intrinsic dimension d. These results confirm that the presence of reflecting boundaries does not degrade the fundamental statistical efficiency of the diffusion paradigm, matching the near-optimal rates known in the unconstrained setting.
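For intuition, a sketch of the kind of series expansion referred to above, under the illustrative assumption that the forward noising process is Brownian motion on [0,1] reflected at both endpoints (generator \tfrac{1}{2}\partial_x^2 with Neumann boundary conditions; the paper's forward dynamics may differ in time scaling or drift): the transition density admits the classical cosine eigenfunction expansion

```latex
\[
p_t(x, y) \;=\; 1 \;+\; 2\sum_{k=1}^{\infty} e^{-k^2 \pi^2 t / 2}\,
\cos(k\pi x)\cos(k\pi y),
\qquad x, y \in [0,1],
\]
```

and on the hypercube [0,1]^D the kernel factorizes as a product of such one-dimensional densities. Since the exponential factors decay rapidly in k for any fixed t > 0, truncating the series yields an accurate and cheaply computable approximation, which is what makes this non-Gaussian kernel tractable in practice.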