Perspective: Towards sustainable exploration of chemical spaces with machine learning

arXiv cs.AI / 4/2/2026

💬 Opinion · Ideas & Deep Analysis · Models & Research

Key Points

  • The Perspective argues that AI is accelerating molecular and materials discovery but introduces major sustainability concerns due to rising energy, compute, and infrastructure demands across the discovery pipeline.
  • It analyzes resource costs from quantum-mechanical data generation and model training through automated “self-driving” research workflows, noting that large quantum datasets improve benchmarking while also increasing environmental and operational burdens.
  • The article highlights efficiency strategies such as general-purpose ML models, multi-fidelity methods, model distillation, and active learning to reduce unnecessary computation.
  • It recommends hierarchical workflows that apply fast ML surrogate models broadly and reserve high-accuracy QM calculations for targeted cases, while embedding physics-based constraints to maintain reliability.
  • It emphasizes bridging computational predictions to real-world feasibility via synthesizability and multi-objective criteria, and calls for sustainable progress through open datasets/models and reusable, domain-specific workflows that maximize scientific value per unit of compute.
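
The hierarchical, uncertainty-driven routing described in the points above can be sketched as a toy active-learning loop. Everything here is illustrative and not from the paper: `qm_energy` is a stand-in for an expensive quantum-mechanical calculation, and the "surrogate" is a bootstrap ensemble of polynomial fits whose spread across members serves as a crude uncertainty signal.

```python
import numpy as np

rng = np.random.default_rng(0)

def qm_energy(x):
    """Stand-in for an expensive QM calculation (hypothetical objective)."""
    return np.sin(3 * x) + 0.5 * x

def surrogate_ensemble(train_x, train_y, query_x, n_models=5):
    """Bootstrap ensemble of polynomial fits; the std across members
    acts as a rough uncertainty estimate for each query point."""
    preds = []
    for _ in range(n_models):
        idx = rng.choice(len(train_x), len(train_x), replace=True)
        coeffs = np.polyfit(train_x[idx], train_y[idx], deg=4)
        preds.append(np.polyval(coeffs, query_x))
    preds = np.array(preds)
    return preds.mean(axis=0), preds.std(axis=0)

# The cheap surrogate screens the whole candidate pool...
candidates = np.linspace(-2, 2, 200)
train_x = rng.uniform(-2, 2, 10)
train_y = qm_energy(train_x)

for _ in range(3):  # a few active-learning rounds
    mean, std = surrogate_ensemble(train_x, train_y, candidates)
    # ...and only the most uncertain candidate is promoted to the QM oracle.
    pick = candidates[np.argmax(std)]
    train_x = np.append(train_x, pick)
    train_y = np.append(train_y, qm_energy(pick))

mean, std = surrogate_ensemble(train_x, train_y, candidates)
best = candidates[np.argmin(mean)]
print(f"surrogate minimum near x = {best:.2f} after {len(train_x)} QM calls")
```

The design point is the budget split: the surrogate is evaluated 200 times per round, but the expensive oracle only once, on the candidate where the model is least certain.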

Abstract

Artificial intelligence is transforming molecular and materials science, but its growing computational and data demands raise critical sustainability challenges. In this Perspective, we examine resource considerations across the AI-driven discovery pipeline, from quantum-mechanical (QM) data generation and model training to automated, self-driving research workflows, building on discussions from the "SusML workshop: Towards sustainable exploration of chemical spaces with machine learning" held in Dresden, Germany. In this context, the availability of large quantum datasets has enabled rigorous benchmarking and rapid methodological progress, while also incurring substantial energy and infrastructure costs. We highlight emerging strategies to enhance efficiency, including general-purpose machine learning (ML) models, multi-fidelity approaches, model distillation, and active learning. Moreover, incorporating physics-based constraints within hierarchical workflows, where fast ML surrogates are applied broadly and high-accuracy QM methods are used selectively, can further optimize resource use without compromising reliability. Equally important is bridging the gap between idealized computational predictions and real-world conditions by accounting for synthesizability and multi-objective design criteria, which is essential for practical impact. Finally, we argue that sustainable progress will rely on open data and models, reusable workflows, and domain-specific AI systems that maximize scientific value per unit of computation, enabling efficient and responsible discovery of technological materials and therapeutics.
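
Multi-objective design criteria of the kind the abstract mentions (e.g., balancing a predicted property against synthesizability) are commonly handled by keeping only non-dominated, i.e. Pareto-optimal, candidates. A minimal sketch, with hypothetical (property score, synthesizability score) tuples where higher is better on both axes:

```python
def dominates(q, p):
    """q dominates p if q is at least as good everywhere and strictly better somewhere."""
    return all(a >= b for a, b in zip(q, p)) and any(a > b for a, b in zip(q, p))

def pareto_front(points):
    """Indices of candidates not dominated by any other candidate."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for q in points)]

# Hypothetical candidates: (predicted property, synthesizability), both maximized.
scores = [(0.9, 0.2), (0.7, 0.8), (0.5, 0.9), (0.6, 0.6), (0.9, 0.8)]
print(pareto_front(scores))  # → [2, 4]
```

Here the candidate scoring (0.9, 0.2) is excellent on the predicted property but dropped anyway, because (0.9, 0.8) matches it there while being far more synthesizable; only the trade-off frontier survives to the expensive validation stage.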