Amortized Filtering and Smoothing with Conditional Normalizing Flows
arXiv stat.ML / 4/9/2026
Key Points
- The paper introduces AFSF, a unified amortized framework for Bayesian filtering and smoothing in high-dimensional nonlinear dynamical systems using conditional normalizing flows.
- AFSF uses a recurrent encoder to map each observation history to a fixed-dimensional summary statistic whose size does not grow with the length of the time series; this summary conditions both a forward flow, which approximates the filtering distribution, and a backward flow, which approximates the backward transition kernel (see the first sketch after this list).
- Smoothing over an entire trajectory is obtained by combining the learned terminal filtering distribution with the learned backward flow through a standard backward recursion (sketched in the second code block below).
- Because the flows learn the structure of the temporal evolution, the approach can extrapolate beyond the training horizon, and sharing the summary statistic between the forward and backward flows acts as implicit regularization across latent trajectories.
- The authors also propose a flow-based particle filtering variant that supports effective-sample-size (ESS) diagnostics whenever the model's factors are available in explicit form, and they report numerical experiments showing accurate approximations (an ESS sketch closes the examples below).
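To make the encoder-plus-conditional-flow idea concrete, here is a minimal PyTorch-style sketch of the filtering component. All names (SummaryEncoder, ConditionalAffineFlow) and dimensions are illustrative assumptions, and a single conditional affine layer stands in for the richer conditional normalizing flow architecture used in the paper.

```python
import math
import torch
import torch.nn as nn

class SummaryEncoder(nn.Module):
    """GRU that maps an observation history y_{1:t} to a fixed-dimensional summary h_t."""
    def __init__(self, y_dim, h_dim):
        super().__init__()
        self.gru = nn.GRU(y_dim, h_dim, batch_first=True)

    def forward(self, y_seq):
        # y_seq: (batch, T, y_dim) -> summaries h_1, ..., h_T: (batch, T, h_dim)
        out, _ = self.gru(y_seq)
        return out

class ConditionalAffineFlow(nn.Module):
    """Single conditional affine layer: x = mu(c) + exp(log_sigma(c)) * z with z ~ N(0, I).
    A stand-in for the paper's conditional normalizing flow."""
    def __init__(self, x_dim, cond_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(cond_dim, 64), nn.Tanh(),
                                 nn.Linear(64, 2 * x_dim))

    def forward(self, cond, n_samples=1):
        mu, log_sigma = self.net(cond).chunk(2, dim=-1)
        z = torch.randn(n_samples, *mu.shape)
        x = mu + log_sigma.exp() * z                       # samples from q(x | cond)
        # log-density of each sample via the change-of-variables formula
        log_q = (-0.5 * z.pow(2) - 0.5 * math.log(2 * math.pi)).sum(-1) - log_sigma.sum(-1)
        return x, log_q

# draw samples from the approximate terminal filtering distribution q(x_T | y_{1:T})
encoder = SummaryEncoder(y_dim=5, h_dim=32)
filter_flow = ConditionalAffineFlow(x_dim=3, cond_dim=32)
y_seq = torch.randn(8, 50, 5)                              # 8 observed trajectories of length 50
h = encoder(y_seq)                                         # (8, 50, 32)
x_T, log_q = filter_flow(h[:, -1], n_samples=100)          # (100, 8, 3), (100, 8)
```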
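The smoothing step can then be sketched as a backward recursion that reuses the hypothetical classes above. The conditioning of the backward kernel shown here (concatenating the summary h_t with the sampled next state x_{t+1}) is an assumption, not a reproduction of the paper's architecture.

```python
import torch

def sample_smoothing_trajectories(encoder, filter_flow, backward_flow, y_seq, n_samples=100):
    """Draw joint trajectory samples x_{1:T} by running the learned backward kernel
    starting from samples of the terminal filtering distribution."""
    h = encoder(y_seq)                                     # (batch, T, h_dim)
    T = h.shape[1]
    x_next, _ = filter_flow(h[:, -1], n_samples)           # x_T: (n_samples, batch, x_dim)
    states = [x_next]
    for t in range(T - 2, -1, -1):
        h_t = h[:, t].expand(n_samples, -1, -1)            # broadcast summary to every particle
        cond = torch.cat([h_t, x_next], dim=-1)            # condition on [h_t, x_{t+1}]
        x_next, _ = backward_flow(cond, n_samples=1)
        x_next = x_next.squeeze(0)                         # (n_samples, batch, x_dim)
        states.append(x_next)
    states.reverse()                                       # reorder to x_1, ..., x_T
    return torch.stack(states, dim=2)                      # (n_samples, batch, T, x_dim)

backward_flow = ConditionalAffineFlow(x_dim=3, cond_dim=32 + 3)   # conditions on [h_t, x_{t+1}]
x_traj = sample_smoothing_trajectories(encoder, filter_flow, backward_flow, y_seq)
```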
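Finally, the ESS diagnostic mentioned in the last point is the standard effective-sample-size computation over importance weights. How the paper's flow-based particle filter forms the model log-densities is not reproduced here, so log_p_model in the usage comment is a hypothetical placeholder for the explicit model factors evaluated at each particle.

```python
import torch

def effective_sample_size(log_p, log_q):
    """ESS = 1 / sum_i w_i^2 for self-normalized importance weights w_i ∝ exp(log_p_i - log_q_i)."""
    log_w = log_p - log_q
    log_w = log_w - torch.logsumexp(log_w, dim=0)          # normalize so the weights sum to 1
    return torch.exp(-torch.logsumexp(2.0 * log_w, dim=0))

# usage with the filtering samples above (log_p_model: model log-density per particle, hypothetical):
# ess = effective_sample_size(log_p_model, log_q)
```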