Stochastic-Dimension Frozen Sampled Neural Network for High-Dimensional Gross-Pitaevskii Equations on Unbounded Domains
arXiv cs.LG / 4/13/2026
Key Points
- The paper introduces a stochastic-dimension frozen sampled neural network (SD-FSNN) aimed at solving high-dimensional Gross-Pitaevskii equations (GPEs) defined on unbounded spatial domains.
- SD-FSNN is designed to be unbiased across dimensions and to keep computational cost independent of dimensionality, avoiding the exponential scaling typical of Hermite-basis discretizations.
- By randomly sampling and then freezing the hidden-layer weights and biases, the method avoids slow iterative gradient-based training, improving both training time and accuracy.
- A space-time separation approach is combined with adaptive ODE solvers to update evolution coefficients while maintaining temporal causality in the learned dynamics.
- The network incorporates physics-informed components: a Gaussian-weighted ansatz enforcing the correct decay at infinity, a normalization projection layer enforcing mass conservation, and an energy-conservation constraint limiting long-time numerical dissipation. Together these yield strong comparative performance in both accuracy and efficiency.
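To make the "frozen sampled" idea concrete, here is a minimal 1D sketch, not the paper's implementation: hidden weights and biases are drawn randomly once and never trained, a Gaussian envelope supplies decay at infinity, and only the linear output coefficients are fit, here by least squares. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 200
W = rng.normal(scale=2.0, size=n_hidden)    # frozen hidden weights (never trained)
b = rng.uniform(-3.0, 3.0, size=n_hidden)   # frozen hidden biases (never trained)

def features(x):
    """Gaussian-weighted random features: phi_j(x) = exp(-x^2/2) * tanh(w_j x + b_j).

    The exp(-x^2/2) envelope plays the role of the paper's Gaussian-weighted
    ansatz, enforcing decay as |x| -> infinity on the unbounded domain.
    """
    env = np.exp(-0.5 * x[:, None] ** 2)
    return env * np.tanh(np.outer(x, W) + b)

# Illustrative target: a ground-state-like profile that decays at infinity.
x = np.linspace(-6.0, 6.0, 400)
u = np.exp(-0.5 * x ** 2)

# Only the output layer is solved for: min_c || Phi c - u ||_2.
Phi = features(x)
c, *_ = np.linalg.lstsq(Phi, u, rcond=None)

err = np.max(np.abs(Phi @ c - u))
print(f"max fit error: {err:.2e}")

# Normalization projection (mass constraint, integral |u|^2 dx = 1),
# mimicking the paper's projection layer:
u_fit = Phi @ c
dx = x[1] - x[0]
u_norm = u_fit / np.sqrt(np.sum(u_fit ** 2) * dx)
print(f"mass after projection: {np.sum(u_norm ** 2) * dx:.6f}")
```

In the full method, these frozen spatial features would be combined with time-dependent evolution coefficients advanced by an adaptive ODE solver (the space-time separation in the key points above); this sketch shows only the frozen-basis fitting step.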