Inverse Neural Operator for ODE Parameter Optimization
arXiv cs.LG / 3/13/2026
Key Points
- The paper introduces Inverse Neural Operator (INO), a two-stage framework to recover hidden ODE parameters from sparse observations.
- Stage 1 uses a Conditional Fourier Neural Operator with cross-attention to reconstruct full ODE trajectories from sparse inputs, employing spectral regularization to suppress high-frequency artifacts.
- Stage 2 uses an Amortized Drifting Model that learns a kernel-weighted velocity field in parameter space to transport random parameter initializations toward the ground truth without backpropagating through the surrogate, avoiding Jacobian instabilities in stiff regimes.
- Experiments on a real-world stiff atmospheric chemistry benchmark (POLLU, 25 parameters) and a synthetic Gene Regulatory Network (GRN, 40 parameters) show INO outperforms gradient-based and amortized baselines in parameter recovery accuracy.
- INO's amortized inference takes 0.23 s per problem, a 487x speedup over iterative gradient descent.
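The two-stage recipe in the key points can be sketched in a few lines. Everything below is illustrative and assumed, not the paper's implementation: the RBF kernel, the per-anchor velocities, the Euler integrator, and the FFT-based penalty are stand-ins for the learned components described above.

```python
import numpy as np

def spectral_penalty(traj, cutoff=0.5):
    # Stage 1 idea (simplified): penalize energy in high-frequency
    # Fourier modes of a reconstructed trajectory to suppress artifacts.
    spec = np.abs(np.fft.rfft(traj, axis=0))
    k0 = int(cutoff * spec.shape[0])
    return float((spec[k0:] ** 2).sum())

def rbf_kernel(theta, anchors, bandwidth=1.0):
    # Normalized kernel weights between the current parameter vector
    # and a set of anchor points in parameter space.
    d2 = ((theta[None, :] - anchors) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return w / w.sum()

def velocity_field(theta, anchors, anchor_velocities, bandwidth=1.0):
    # Kernel-weighted combination of (hypothetical) learned
    # per-anchor velocities -- the drift direction at theta.
    w = rbf_kernel(theta, anchors, bandwidth)
    return w @ anchor_velocities

def transport(theta0, anchors, anchor_velocities, steps=100, dt=0.05):
    # Stage 2 idea: Euler-integrate d(theta)/dt = v(theta) to move a
    # random initialization toward the target parameters. Note that no
    # gradients flow through any surrogate model here, which is what
    # sidesteps Jacobian instabilities in stiff regimes.
    theta = theta0.copy()
    for _ in range(steps):
        theta = theta + dt * velocity_field(theta, anchors, anchor_velocities)
    return theta
```

With a vanishing bandwidth, the velocity field reduces to nearest-anchor lookup; with a large one, it averages all anchors. The bandwidth therefore trades off smoothness of the transport map against fidelity to individual training examples.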