AI Navigate

Understanding the Theoretical Foundations of Deep Neural Networks through Differential Equations

arXiv cs.AI / 3/20/2026


Key Points

  • The survey proposes differential equations as a principled theoretical foundation for understanding, analyzing, and improving deep neural networks.
  • It presents two perspectives—the model level, viewing the whole network as a differential equation, and the layer level, modeling individual components as differential equations.
  • The paper explains how tools from differential equations can be used to guide architecture design and performance enhancement in a principled way.
  • It discusses real-world applications, along with key challenges and opportunities for future research in grounding DNNs in differential equations.
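To make the model-level perspective concrete: a standard observation in this literature is that a residual block, x_{k+1} = x_k + h·f(x_k), is exactly one forward-Euler step of the ODE dx/dt = f(x), so a deep stack of such blocks approximates the ODE's flow. A minimal numerical sketch (here `f` is a toy stand-in for a learned layer, and the depths and step sizes are illustrative choices, not from the survey):

```python
import numpy as np

def f(x):
    # Vector field of the ODE dx/dt = f(x); a fixed nonlinearity
    # stands in for a learned layer.
    return np.tanh(x)

def residual_network(x, depth, h):
    # A stack of residual blocks: x_{k+1} = x_k + h * f(x_k).
    # This is forward Euler applied to dx/dt = f(x) with step size h.
    for _ in range(depth):
        x = x + h * f(x)
    return x

x0 = np.array([0.5, -1.0])
T = 1.0  # total integration time covered by the network

# A 10-block network integrates to time T with step h = T/10 ...
coarse = residual_network(x0.copy(), depth=10, h=T / 10)
# ... and a much finer Euler discretization serves as a reference
# for the underlying ODE flow at time T.
fine = residual_network(x0.copy(), depth=10_000, h=T / 10_000)
```

As depth grows with h = T/depth, the network output converges to the ODE solution, which is the sense in which the whole model "is" a differential equation.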

Abstract

Deep neural networks (DNNs) have achieved remarkable empirical success, yet the absence of a principled theoretical foundation continues to hinder their systematic development. In this survey, we present differential equations as a theoretical foundation for understanding, analyzing, and improving DNNs. We organize the discussion around three guiding questions: i) how differential equations offer a principled understanding of DNN architectures, ii) how tools from differential equations can be used to improve DNN performance in a principled way, and iii) what real-world applications benefit from grounding DNNs in differential equations. We adopt a two-fold perspective spanning the model level, which interprets the whole DNN as a differential equation, and the layer level, which models individual DNN components as differential equations. From these two perspectives, we review how this framework connects model design, theoretical analysis, and performance improvement. We further discuss real-world applications, as well as key challenges and opportunities for future research.
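The layer-level perspective can be illustrated with a common example from this line of work: a convolution with the finite-difference kernel [1, -2, 1] is a discrete Laplacian, so a layer of the form x + h·Δx is one explicit Euler step of the heat equation and acts as a smoothing operation. A small sketch under that interpretation (the periodic boundary handling and step size are illustrative assumptions):

```python
import numpy as np

def discrete_laplacian(x):
    # 1-D convolution with kernel [1, -2, 1] under periodic
    # boundaries: a finite-difference Laplacian.
    return np.roll(x, 1) - 2 * x + np.roll(x, -1)

def diffusion_layer(x, h):
    # One "layer" = one explicit Euler step of the heat
    # equation du/dt = u_xx, which diffuses (smooths) x.
    return x + h * discrete_laplacian(x)

signal = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
out = signal
for _ in range(20):
    out = diffusion_layer(out, h=0.2)
# The spike is progressively smoothed; with periodic boundaries
# the layer conserves the signal's total mass.
```

Viewing a layer this way is what lets PDE tools (stability conditions, conservation properties) guide its design, e.g. the explicit step is stable here only for sufficiently small h.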