Towards A Generative Protein Evolution Machine with DPLM-Evo

arXiv cs.LG / 5/4/2026

Key Points

  • The paper introduces DPLM-Evo, a discrete diffusion framework for protein language modeling that aims to better reflect how proteins evolve through accumulated substitutions and indels rather than mask-based generation.
  • DPLM-Evo explicitly predicts substitution, insertion, and deletion operations during denoising, improving suitability for both post-editing/optimization and flexible guided generation.
  • By decoupling an upsampled-length latent alignment space from the variable-length observed sequence space, the method makes indel-aware, variable-length generation feasible with little additional compute.
  • The authors propose a contextualized evolutionary noising kernel to generate biologically informed, context-dependent mutation patterns, improving realism of substitution behavior.
  • Experiments show improved sequence understanding and state-of-the-art single-sequence mutation effect prediction on ProteinGym, along with support for variable-length simulated evolution and edit-trajectory optimization of existing proteins.
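To make the edit-operation idea above concrete, here is a toy sketch (our own illustration, not the authors' code): if each denoising step predicts explicit substitution, insertion, and deletion operations, a trajectory can be represented as an edit script applied to a sequence. The `Edit` tuple format and `apply_edits` helper are hypothetical.

```python
from typing import List, Tuple

# An edit is (op, position, residue): "sub" replaces the residue at position,
# "ins" inserts before position, "del" removes it; residue is ignored for "del".
Edit = Tuple[str, int, str]

def apply_edits(seq: str, edits: List[Edit]) -> str:
    """Apply one denoising step's edits. Positions index the current sequence;
    edits are applied right-to-left so earlier positions stay valid."""
    chars = list(seq)
    for op, pos, res in sorted(edits, key=lambda e: e[1], reverse=True):
        if op == "sub":
            chars[pos] = res
        elif op == "ins":
            chars.insert(pos, res)
        elif op == "del":
            del chars[pos]
    return "".join(chars)

# One hypothetical step of an edit trajectory on a short peptide.
step = [("sub", 1, "K"), ("del", 4, ""), ("ins", 0, "M")]
print(apply_edits("ARNDC", step))  # -> "MAKND"
```

Composing such steps yields the explicit edit trajectories the paper uses for post-editing and optimization of existing proteins.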

Abstract

Proteins are shaped by gradual evolution under biophysical and functional constraints. Protein language models learn rich evolutionary constraints from large-scale sequences, and discrete diffusion-based protein language models (e.g., DPLMs) are promising for both understanding and generation. However, existing DPLMs typically rely on masking-based absorbing diffusion that contradicts a simple biological intuition: proteins evolve through accumulated edits, not by emerging from masks. Consequently, these frameworks lack explicit pretraining objectives for substitution and insertion/deletion (indel) operations, limiting both optimization-style post-editing and flexible guided generation. To address these limitations, we present DPLM-Evo, an evolutionary discrete diffusion framework that explicitly predicts substitution, insertion, and deletion operations during denoising. DPLM-Evo decouples an upsampled-length latent alignment space from the variable-length observed sequence space, which makes indel-aware generation tractable and enables adaptive scaffold growth throughout the process with negligible computational overhead. To better align substitutions with real evolution, we further introduce a contextualized evolutionary noising kernel that produces biologically informed, context-dependent mutation patterns. Across tasks, DPLM-Evo improves sequence understanding and achieves state-of-the-art mutation effect prediction performance on ProteinGym in the single-sequence setting. It also enables variable-length simulated evolution and post-editing/optimization of existing proteins via explicit edit trajectories.
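The decoupling of a fixed, upsampled-length latent alignment space from the variable-length observed sequence can be pictured with gap tokens, in the spirit of a multiple-sequence alignment. The sketch below is an assumption about one plausible realization, not the paper's implementation: the latent track has a fixed length, the observed sequence is read off by dropping gaps, and indels become gap flips rather than length changes.

```python
GAP = "-"

def to_latent(seq: str, latent_len: int) -> str:
    """Embed a sequence in a fixed-length latent track, padding with gap tokens."""
    assert len(seq) <= latent_len, "latent track must be at least sequence length"
    return seq + GAP * (latent_len - len(seq))

def to_observed(latent: str) -> str:
    """Decode the variable-length observed sequence by stripping gap tokens."""
    return latent.replace(GAP, "")

latent = to_latent("ARND", 8)  # 'ARND----'
# A "deletion" flips a residue to a gap; an "insertion" fills a gap slot.
edited = latent[:1] + GAP + latent[2:5] + "W" + latent[6:]  # 'A-ND-W--'
print(to_observed(edited))  # -> "ANDW"
```

Because the latent length is fixed, every denoising step operates on same-shaped tensors while the decoded sequence freely grows and shrinks, which is consistent with the "negligible computational overhead" claim for variable-length generation.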
