Neuro-Symbolic ODE Discovery with Latent Grammar Flow

arXiv cs.LG / April 20, 2026


Key Points

  • The paper presents Latent Grammar Flow (LGF), a neuro-symbolic generative framework designed to discover ordinary differential equations (ODEs) from observed data.
  • LGF represents candidate equations in a discrete latent space using grammar-based representations, and it uses a behavioral loss to cluster semantically similar equations closer together.
  • A discrete flow model recursively samples and generates candidate equations that best match the target data.
  • The approach supports incorporating domain knowledge and constraints (e.g., stability) either directly into the grammar rules or via conditional predictors.
  • The motivation is to combine the interpretability and transferability of symbolic models with learning-based discovery, rather than relying on black-box methods.
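To make the grammar-based representation concrete, here is a toy sketch (the grammar, rule indices, and helper below are invented for illustration and are not taken from the paper): each candidate equation's right-hand side is derived from a small context-free grammar, so the equation can be stored as a discrete sequence of production-rule choices, which is exactly the kind of token sequence a discrete generative model can sample.

```python
# Hypothetical toy grammar for ODE right-hand sides: each nonterminal
# maps to a list of productions; a derivation is a list of rule indices.
GRAMMAR = {
    "E": [["E", "+", "T"], ["T"]],
    "T": [["T", "*", "F"], ["F"]],
    "F": [["x"], ["sin", "(", "E", ")"], ["c"]],
}

def derive(rule_choices, symbol="E"):
    """Expand `symbol` left-to-right, consuming one rule index per nonterminal."""
    out = []
    stack = [symbol]
    choices = iter(rule_choices)
    while stack:
        sym = stack.pop(0)
        if sym in GRAMMAR:
            # Nonterminal: replace it with the chosen production's symbols.
            stack = list(GRAMMAR[sym][next(choices)]) + stack
        else:
            # Terminal: emit it into the output string.
            out.append(sym)
    return " ".join(out)

# The discrete sequence [0, 1, 1, 2, 1, 0] picks one production per
# expansion step and decodes to the expression "c + x".
print(derive([0, 1, 1, 2, 1, 0]))
```

Because every valid rule sequence decodes to a syntactically valid expression, constraints from domain knowledge can be enforced simply by restricting which productions the grammar contains.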

Abstract

Understanding natural and engineered systems often relies on symbolic formulations, such as differential equations, which provide interpretability and transferability beyond black-box models. We introduce Latent Grammar Flow (LGF), a neuro-symbolic generative framework for discovering ordinary differential equations from data. LGF embeds equations as grammar-based representations into a discrete latent space and forces semantically similar equations to be positioned closer together with a behavioural loss. Then, a discrete flow model guides the sampling process to recursively generate candidate equations that best fit the observed data. Domain knowledge and constraints, such as stability, can be either embedded into the rules or used as conditional predictors.
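The abstract does not specify how the behavioural loss is computed, but its core idea can be sketched as follows: score the distance between two candidate ODEs by how differently they behave when simulated, rather than by how their formulas look. The integrator, step sizes, and initial conditions below are illustrative assumptions, not the paper's implementation.

```python
import math

def simulate(f, x0, dt=0.01, steps=200):
    """Forward-Euler rollout of dx/dt = f(x) starting from x0."""
    xs, x = [x0], x0
    for _ in range(steps):
        x = x + dt * f(x)
        xs.append(x)
    return xs

def behavioural_distance(f, g, inits=(-1.0, 0.5, 2.0)):
    """Mean squared gap between the trajectories of f and g over shared inits."""
    total, count = 0.0, 0
    for x0 in inits:
        for a, b in zip(simulate(f, x0), simulate(g, x0)):
            total += (a - b) ** 2
            count += 1
    return total / count

# Syntactically different but behaviourally similar near small x:
f = lambda x: math.sin(x)       # dx/dt = sin(x)
g = lambda x: x - x**3 / 6      # Taylor approximation of sin(x)
h = lambda x: -x                # qualitatively different dynamics

# f and g produce nearby trajectories, so they should embed close together;
# f and h diverge, so they should land far apart in the latent space.
assert behavioural_distance(f, g) < behavioural_distance(f, h)
```

Under a loss like this, equivalent rewrites of the same dynamics (e.g. algebraically simplified forms) cluster together in the latent space, which is the property the abstract attributes to the behavioural loss.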