Hessian-informed machine learning interatomic potential towards bridging theory and experiments

arXiv cs.LG / 3/27/2026


Key Points

  • The paper proposes a Hessian-informed machine learning interatomic potential (Hi-MLIP) designed to capture local curvature of potential energy surfaces that governs certain experimentally relevant observables.
  • To enable Hessian supervision without prohibitive cost, the authors introduce Hessian INformed Training (HINT), which combines Hessian pre-training, configuration sampling, curriculum learning, and a stochastic projection Hessian loss to cut expensive Hessian label requirements by 2–4 orders of magnitude.
  • Hi-MLIP trained with HINT shows improved transition-state search performance and yields Gibbs free-energy predictions near chemical accuracy, particularly in data-scarce regimes.
  • The approach is demonstrated on strongly anharmonic hydrides, where it reproduces phonon renormalization and superconducting critical temperatures in close agreement with experiment while avoiding the usual computational bottleneck of explicit anharmonic calculations.

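The paper does not spell out the form of the stochastic projection Hessian loss, but a natural reading is a Hutchinson-style estimator: instead of matching the full 3N×3N Hessian, the model is supervised on Hessian-vector products along random directions, which for unit-norm probes estimates the Frobenius mismatch at a fraction of the cost. The sketch below illustrates that idea on toy matrices; `H_ref`, `H_pred`, and the dimensions are illustrative stand-ins, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 9  # degrees of freedom of a toy 3-atom system
A = rng.normal(size=(n, n))
H_ref = A + A.T                    # symmetric "reference" Hessian (stand-in for DFT)
P = rng.normal(size=(n, n))
H_pred = H_ref + 0.05 * (P + P.T)  # "model" Hessian with a small error

def stochastic_projection_loss(H_model, H_target, num_probes, rng):
    """Average squared mismatch of Hessian-vector products over random
    unit probe directions; never forms the full difference matrix.
    For uniform unit probes v, E[||(H_model - H_target) v||^2]
    equals ||H_model - H_target||_F^2 / n."""
    loss = 0.0
    for _ in range(num_probes):
        v = rng.normal(size=n)
        v /= np.linalg.norm(v)
        diff = H_model @ v - H_target @ v
        loss += diff @ diff
    return loss / num_probes

full_loss = np.linalg.norm(H_pred - H_ref, "fro") ** 2 / n
proj_loss = stochastic_projection_loss(H_pred, H_ref, 4000, rng)
```

In an MLIP training loop the products `H_model @ v` would come from automatic differentiation (a gradient-of-gradient contraction), so only a handful of probe directions, not the full Hessian, need to be evaluated per configuration.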
Abstract

Local curvature of potential energy surfaces is critical for predicting certain experimental observables of molecules and materials from first principles, yet it remains far beyond reach for complex systems. In this work, we introduce a Hessian-informed Machine Learning Interatomic Potential (Hi-MLIP) that captures such curvature reliably, thereby enabling accurate analysis of associated thermodynamic and kinetic phenomena. To make Hessian supervision practically viable, we develop a highly efficient training protocol, termed Hessian INformed Training (HINT), which achieves a two-to-four-order-of-magnitude reduction in the requirement for expensive Hessian labels. HINT integrates several key techniques: Hessian pre-training, configuration sampling, curriculum learning, and a stochastic projection Hessian loss. Enabled by HINT, Hi-MLIP significantly improves transition-state search and brings Gibbs free-energy predictions close to chemical accuracy, especially in data-scarce regimes. Our framework also enables accurate treatment of strongly anharmonic hydrides, reproducing phonon renormalization and superconducting critical temperatures in close agreement with experiment while bypassing the computational bottleneck of explicit anharmonic calculations. These results establish a practical route to enhancing the curvature awareness of machine learning interatomic potentials, bridging simulation and experimental observables across a wide range of systems.