
Minimum-Action Learning: Energy-Constrained Symbolic Model Selection for Physical Law Identification from Noisy Data

arXiv cs.LG / 3/19/2026


Key Points

  • Minimum-Action Learning (MAL) identifies physical laws by selecting symbolic force laws from a predefined basis library using a Triple-Action objective that combines trajectory reconstruction, architectural sparsity, and energy-conservation enforcement.
  • A wide-stencil acceleration-matching technique reduces noise variance by about 10,000x, transforming a very low SNR setting into a learnable problem and enabling robust recovery across methods, including SINDy variants.
  • On Kepler gravity and Hooke's law benchmarks, MAL recovers the correct force law with a Kepler exponent of p = 3.01 ± 0.01 at an energy cost of about 0.07 kWh, a 40% reduction versus baselines that rely on prediction error alone.
  • The raw correct-basis rates are 40% for Kepler and 90% for Hooke, and an energy-conservation criterion yields 100% pipeline-level identification, demonstrating the effectiveness of the conservation diagnostic.
  • Basis-library sensitivity experiments show that near-confounder terms degrade selection while distant additions are harmless, and MAL's energy constraint and dynamical rollout validation distinguish it from SINDy, Hamiltonian Neural Networks, and Lagrangian Neural Networks.
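The wide-stencil variance reduction in the second bullet can be illustrated numerically. The sketch below is an assumption-laden toy, not the paper's implementation: the stencil width k = 10, the noise level, and the Hooke-like test trajectory are all illustrative choices. The key point it demonstrates is that a stride-k central second difference divides the noise variance of the acceleration estimate by roughly (k)^4, so widening the stencil from k = 1 to k = 10 yields an ~10,000x reduction while the smooth signal's curvature is barely biased.

```python
import numpy as np

# Illustrative sketch of wide-stencil acceleration matching (toy setup;
# stencil width, noise level, and trajectory are assumptions, not the
# paper's exact configuration).
rng = np.random.default_rng(0)
dt, n, sigma = 1e-3, 5000, 0.05
t = np.arange(n) * dt
x_true = np.cos(t)                              # Hooke-like trajectory
a_true = -np.cos(t)                             # true acceleration
x = x_true + sigma * rng.standard_normal(n)     # noisy observations

def accel_stencil(x, dt, k):
    """Stride-k central second difference: a_i ~ (x[i+k] - 2 x[i] + x[i-k]) / (k*dt)**2."""
    return (x[2*k:] - 2.0 * x[k:-k] + x[:-2*k]) / (k * dt) ** 2

var_narrow = np.var(accel_stencil(x, dt, 1) - a_true[1:-1])
var_wide = np.var(accel_stencil(x, dt, 10) - a_true[10:-10])
print(f"variance reduction: {var_narrow / var_wide:.0f}x")  # prints a ratio near 10,000
```

The reduction follows because the noise term in the second difference is divided by (k*dt)^2, so its variance shrinks as 1/k^4; with k = 10 that is the four-orders-of-magnitude gain the paper attributes to this preprocessing step.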

Abstract

Identifying physical laws from noisy observational data is a central challenge in scientific machine learning. We present Minimum-Action Learning (MAL), a framework that selects symbolic force laws from a pre-specified basis library by minimizing a Triple-Action functional combining trajectory reconstruction, architectural sparsity, and energy-conservation enforcement. A wide-stencil acceleration-matching technique reduces noise variance by 10,000x, transforming an intractable problem (SNR ~0.02) into a learnable one (SNR ~1.6); this preprocessing is the critical enabler shared by all methods tested, including SINDy variants. On two benchmarks -- Kepler gravity and Hooke's law -- MAL recovers the correct force law with Kepler exponent p = 3.01 ± 0.01 at ~0.07 kWh (40% reduction vs. prediction-error-only baselines). The raw correct-basis rate is 40% for Kepler and 90% for Hooke; an energy-conservation-based criterion discriminates the true force law in all cases, yielding 100% pipeline-level identification. Basis library sensitivity experiments show that near-confounders degrade selection (20% with added r^(-2.5) and r^(-1.5)), while distant additions are harmless, and the conservation diagnostic remains informative even when the correct basis is absent. Direct comparison with noise-robust SINDy variants, Hamiltonian Neural Networks, and Lagrangian Neural Networks confirms MAL's distinct niche: interpretable, energy-constrained model selection that combines symbolic basis identification with dynamical rollout validation.
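The selection scheme the abstract describes -- fit each candidate basis term to the estimated accelerations, then use a conservation diagnostic on rolled-out trajectories to pick the true law -- can be sketched in a toy 1D Hooke setting. Everything below is an assumption for illustration (the basis exponents, the equal weighting of residual and drift, and the semi-implicit Euler integrator), not the paper's actual Triple-Action objective.

```python
import numpy as np

# Hedged sketch of energy-constrained symbolic model selection (toy 1D
# example; basis, weights, and integrator are illustrative assumptions).
dt, n = 0.01, 2000
t = np.arange(n) * dt
x, v, a = np.cos(t), -np.sin(t), -np.cos(t)   # true law: a = -x (p = 1)

basis = {1: x, 3: x**3}                       # candidate force laws a = c * x**p

scores = {}
for p, phi in basis.items():
    c = float(phi @ a) / float(phi @ phi)     # least-squares coefficient
    resid = float(np.mean((a - c * phi) ** 2))
    # Roll out the candidate law (semi-implicit Euler) and track energy drift;
    # for a = c*x**p the potential per unit mass is U(x) = -c * x**(p+1) / (p+1).
    energy = lambda xx, vv: 0.5 * vv**2 - c * xx**(p + 1) / (p + 1)
    xs, vs = x[0], v[0]
    E0, drift = energy(xs, vs), 0.0
    for _ in range(n):
        vs += c * xs**p * dt
        xs += vs * dt
        drift = max(drift, abs(energy(xs, vs) - E0))
    scores[p] = resid + drift                 # reconstruction + conservation
best = min(scores, key=scores.get)
print("selected exponent:", best)             # prints: selected exponent: 1
```

The wrong basis (p = 3) still admits a least-squares fit, but its larger reconstruction residual on the rollout separates it from the true law -- a simplified analogue of how the paper's energy-conservation criterion lifts a 40% raw correct-basis rate to 100% pipeline-level identification.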