Abstract
In finite-width deep neural networks, the empirical kernel $G$ evolves stochastically across layers. We develop a collective kernel effective field theory (EFT) for pre-activation ResNets based on a $G$-only closure hierarchy and diagnose the finite depth window over which it remains valid. Exploiting the exact conditional Gaussianity of the residual increments, we derive an exact stochastic recursion for $G$. A systematic sequence of Gaussian approximations then yields a continuous-depth ODE system for the mean kernel $K_0$, the kernel covariance $V_4$, and the $1/n$ mean correction $K_{1,\mathrm{EFT}}$, which emerges diagrammatically as a one-loop tadpole correction. Numerically, $K_0$ remains accurate at all depths. The residual of the $V_4$ equation, however, accumulates to an $O(1)$ error at finite depth, driven primarily by approximation errors in the $G$-only transport term. The correction $K_{1,\mathrm{EFT}}$ fails more severely: the source closure on which it relies exhibits a systematic mismatch already at initialization. These findings expose the limits of $G$-only state-space reduction and motivate extending the state space to incorporate the $\sigma$-kernel.