Gauge-covariant stochastic neural fields: Stability and finite-width effects
arXiv stat.ML · April 23, 2026
Key Points
- The paper presents a gauge-covariant stochastic effective field theory aimed at analyzing stability and finite-width effects in deep neural systems.
- It formulates the theory using commuting classical fields, including a complex matter field, a real Abelian connection field, and a stochastic depth variable, and then derives a functional representation via the Martin–Siggia–Rose–Janssen–de Dominicis framework.
- The authors use a two-replica linear-response setup to compute two diagnostics of the dynamics: the maximal Lyapunov exponent and the amplification factor near the "edge of chaos."
- Finite-width effects are treated as perturbative corrections to dressed kernels, and the marginality condition is found to remain unchanged at the considered order for fixed kernel geometry.
- Numerical experiments on finite-width multilayer perceptrons and a linear stochastic effective sector show results consistent with the mean-field instability threshold and predicted low-frequency spectral deformation.
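The paper's own replica construction is not spelled out here, but the mean-field instability threshold it references can be illustrated with a standard finite-width experiment: estimate the maximal Lyapunov exponent of a random tanh multilayer perceptron by propagating two nearby inputs and tracking the per-layer growth of their separation. This is a generic sketch, not the authors' setup; the function name, network sizes, and the tanh nonlinearity are all assumptions.

```python
import numpy as np

def lyapunov_mlp(sigma_w, depth=50, width=500, eps=1e-6, seed=0):
    """Estimate the maximal Lyapunov exponent of a random tanh MLP
    (hypothetical illustration, not the paper's exact model).

    Propagates an input and a perturbed copy through `depth` random
    layers with weight scale sigma_w / sqrt(width), accumulating the
    per-layer log growth of the perturbation norm.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    y = x + eps * rng.standard_normal(width)
    log_growth = 0.0
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        x = np.tanh(W @ x)
        y = np.tanh(W @ y)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / eps)
        # Renormalize the perturbation so it stays infinitesimal and
        # neither saturates nor underflows over many layers.
        y = x + (y - x) * (eps / d)
    return log_growth / depth
```

In the mean-field picture (zero bias, tanh activation), the edge of chaos sits at weight scale sigma_w = 1: below it the exponent is negative (ordered phase, perturbations contract), above it positive (chaotic phase, perturbations grow), which is the instability threshold the paper's finite-width experiments are compared against.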