Joint Surrogate Learning of Objectives, Constraints, and Sensitivities for Efficient Multi-objective Optimization of Neural Dynamical Systems

arXiv cs.LG / March 24, 2026


Key Points

  • The paper presents DMOSOPT, a scalable framework for efficiently optimizing neural dynamical systems with multiple objectives under many constraints that create a hard feasible/infeasible boundary with little or no usable gradient signal.
  • DMOSOPT uses a single jointly learned surrogate model to approximate both the objective landscape and the feasibility boundary, enabling a unified gradient that simultaneously improves objective values and increases constraint satisfaction (see the sketch after this list).
  • The approach also extracts partial derivatives from the surrogate to estimate per-parameter sensitivities, supporting more targeted and efficient exploration of high-dimensional parameter spaces.
  • Experiments range from single-cell dynamics to population-level neural network activity, including staged validation of an end-to-end neural circuit modeling workflow.
  • The authors report that DMOSOPT achieves efficient optimization at supercomputing scale with substantially fewer evaluations, and note the method is generally applicable to constrained multi-objective optimization beyond computational neuroscience.
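The joint-surrogate idea can be pictured with a short sketch. Below, a single hypothetical MLP maps candidate parameters to objective estimates plus a feasibility logit, and automatic differentiation through that shared model yields one gradient that both lowers a scalarized objective and reduces predicted infeasibility. This is a minimal illustration under assumed names (init_mlp, surrogate, unified_loss) and an assumed softplus infeasibility penalty; it is not DMOSOPT's actual architecture or acquisition rule.

    # Minimal sketch of a joint surrogate (illustrative, not DMOSOPT's code).
    # One MLP predicts [objectives..., feasibility_logit]; differentiating a
    # scalarized loss through it gives a single "unified" gradient direction.
    import jax
    import jax.numpy as jnp

    def init_mlp(key, sizes):
        """Random weights for a small MLP (hypothetical surrogate architecture)."""
        params = []
        for din, dout in zip(sizes[:-1], sizes[1:]):
            key, sub = jax.random.split(key)
            w = jax.random.normal(sub, (din, dout)) / jnp.sqrt(din)
            params.append((w, jnp.zeros(dout)))
        return params

    def surrogate(params, theta, n_obj):
        """Shared trunk; the head outputs objective estimates and a feasibility logit."""
        h = theta
        for w, b in params[:-1]:
            h = jnp.tanh(h @ w + b)
        w, b = params[-1]
        out = h @ w + b
        return out[:n_obj], out[n_obj]

    def unified_loss(theta, params, weights, lam, n_obj):
        """Scalarized objectives plus a penalty for predicted infeasibility."""
        objs, feas_logit = surrogate(params, theta, n_obj)
        infeasibility = jax.nn.softplus(-feas_logit)  # ~0 when confidently feasible
        return jnp.dot(weights, objs) + lam * infeasibility

    # One gradient step on candidate parameters, taken through the surrogate:
    key = jax.random.PRNGKey(0)
    n_params, n_obj = 8, 2
    params = init_mlp(key, [n_params, 32, 32, n_obj + 1])
    theta = jnp.zeros(n_params)
    grad = jax.grad(unified_loss)(theta, params, jnp.array([0.5, 0.5]), 1.0, n_obj)
    theta_next = theta - 0.1 * grad

Because the surrogate is smooth, this gradient exists even where the underlying simulation offers only a binary feasible/infeasible signal, which is the property the paper exploits.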

Abstract

Biophysical neural system simulations are among the most computationally demanding scientific applications, and their optimization requires navigating high-dimensional parameter spaces under numerous constraints that impose a binary feasible/infeasible partition with no gradient signal to guide the search. Here, we introduce DMOSOPT, a scalable optimization framework that leverages a unified, jointly learned surrogate model to capture the interplay between objectives, constraints, and parameter sensitivities. By learning a smooth approximation of both the objective landscape and the feasibility boundary, the joint surrogate provides a unified gradient that simultaneously steers the search toward improved objective values and greater constraint satisfaction, while its partial derivatives yield per-parameter sensitivity estimates that enable more targeted exploration. We validate the framework from single-cell dynamics to population-level network activity, spanning incremental stages of a neural circuit modeling workflow, and demonstrate efficient, effective optimization of highly constrained problems at supercomputing scale with substantially fewer problem evaluations. While motivated by and demonstrated in the context of computational neuroscience, the framework is general and applicable to constrained multi-objective optimization problems across scientific and engineering domains.
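The abstract's claim that the surrogate's partial derivatives yield per-parameter sensitivity estimates can be made concrete with a small, self-contained sketch. Here fitted_surrogate is a hypothetical stand-in for any differentiable surrogate mapping parameters to predicted objectives, and the ranking heuristic at the end is an assumption about how sensitivities might guide exploration, not the paper's stated procedure.

    # Hedged sketch: per-parameter sensitivities from a surrogate's Jacobian.
    # `fitted_surrogate` is a placeholder for a trained differentiable model.
    import jax
    import jax.numpy as jnp

    def fitted_surrogate(theta):
        """Placeholder differentiable surrogate: parameters -> objective estimates."""
        return jnp.stack([jnp.sum(theta ** 2), jnp.sum(jnp.cos(theta))])

    theta = jnp.linspace(-1.0, 1.0, 8)         # a candidate parameter vector
    jac = jax.jacfwd(fitted_surrogate)(theta)  # shape (n_objectives, n_params)
    sens = jnp.abs(jac)                        # per-parameter sensitivity magnitudes
    # Parameters with consistently small sensitivity across objectives could be
    # sampled more coarsely, concentrating the evaluation budget elsewhere.
    ranking = jnp.argsort(-sens.mean(axis=0))

One cheap forward-mode Jacobian of the surrogate thus replaces many expensive simulation runs when deciding which parameter dimensions deserve the most attention.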