Inverse Neural Operator for ODE Parameter Optimization
arXiv cs.LG / 3/13/2026
Key Points
- The paper introduces Inverse Neural Operator (INO), a two-stage framework to recover hidden ODE parameters from sparse observations.
- Stage 1 uses a Conditional Fourier Neural Operator with cross-attention to reconstruct full ODE trajectories from sparse inputs, employing spectral regularization to suppress high-frequency artifacts.
- Stage 2 uses an Amortized Drifting Model that learns a kernel-weighted velocity field in parameter space to transport random parameter initializations toward the ground truth without backpropagating through the surrogate, avoiding Jacobian instabilities in stiff regimes.
- Experiments on a real-world stiff atmospheric chemistry benchmark (POLLU, 25 parameters) and a synthetic Gene Regulatory Network (GRN, 40 parameters) show INO outperforms gradient-based and amortized baselines in parameter recovery accuracy.
- Inference time is 0.23s, representing a 487x speedup over iterative gradient descent.
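The spectral regularization mentioned in Stage 1 can be sketched as a penalty on the energy of high-frequency Fourier modes of the reconstructed trajectory. This is an illustrative sketch, not the paper's implementation: the function name, cutoff fraction, and weight are all assumptions.

```python
import numpy as np

def spectral_regularizer(traj, cutoff_frac=0.5, weight=1e-3):
    """Hypothetical penalty on high-frequency artifacts in a reconstructed
    trajectory: L2 energy of Fourier modes above a cutoff frequency.

    traj: array of shape (T,) or (T, D), sampled along the time axis.
    cutoff_frac and weight are illustrative hyperparameters.
    """
    coeffs = np.fft.rfft(traj, axis=0)         # Fourier coefficients along time
    cutoff = int(cutoff_frac * coeffs.shape[0])
    high = coeffs[cutoff:]                     # modes above the cutoff
    return weight * np.sum(np.abs(high) ** 2)  # energy of high-frequency modes
```

Added to the reconstruction loss, a term like this drives the surrogate toward smooth trajectories, which matters in stiff regimes where spurious oscillations are easy to introduce.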
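Stage 2's transport of random initializations along a kernel-weighted velocity field can be illustrated with a toy version: an RBF-kernel average of drift vectors attached to anchor points in parameter space, iterated without any backpropagation through a surrogate model. Everything here (anchors, drift construction, bandwidth, step size) is a hypothetical stand-in for the learned Amortized Drifting Model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: anchor points in a 2D parameter space, each carrying a
# drift vector toward a ground-truth parameter theta_star. In the paper this
# field would be learned; here it is constructed directly for illustration.
theta_star = np.array([1.5, -0.7])
anchors = rng.normal(size=(64, 2)) * 2.0
drifts = theta_star - anchors                  # drift at each anchor

def velocity(theta, anchors, drifts, bandwidth=1.0):
    """Kernel-weighted velocity field: RBF-weighted average of anchor drifts."""
    d2 = np.sum((anchors - theta) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w /= w.sum() + 1e-12                       # normalize; guard empty neighborhoods
    return w @ drifts

# Transport a random initialization along the field; no Jacobians of a
# surrogate are needed, only evaluations of the velocity field itself.
theta0 = rng.normal(size=2) * 3.0
theta = theta0.copy()
for _ in range(200):
    theta = theta + 0.1 * velocity(theta, anchors, drifts)
```

Because each update only evaluates the field, the stiffness of the underlying ODE never enters the optimization dynamics, which is the claimed advantage over backpropagating through the trajectory surrogate.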