Cross-Granularity Representations for Biological Sequences: Insights from ESM and BiGCARP

arXiv cs.LG / 2026-03-24


Key Points

  • The paper explores how to integrate cross-granularity representations in biological sequence foundation models, contrasting symbolic granularity in language with hierarchical granularity in biology (nucleotides, amino acids, domains, genes).
  • Using BiGCARP (Pfam domain-level) and ESM (amino-acid-level), the authors find that naive cross-model embedding initialization can fail, while deeper-layer embeddings provide a more contextual and faithful representation of a model's learned knowledge.
  • Representation analysis and probe tasks show that different granularity levels encode complementary biological information rather than redundant signals.
  • The study demonstrates that combining representations across granularities produces measurable gains on intermediate-level prediction tasks and can improve interpretability.
  • Overall, the work positions cross-granularity integration as a promising strategy for advancing biological foundation model performance and analysis.
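To make the combination step concrete, here is a minimal sketch of fusing representations from two granularity levels before probing. The embedding dimensions, the random stand-in arrays, and the simple concatenation strategy are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for per-sequence embeddings from two models at different
# granularities. Dimensions are hypothetical; real ESM / BiGCARP layer
# sizes and extraction code differ.
n_seqs = 8
esm_emb = rng.normal(size=(n_seqs, 1280))     # amino-acid-level view
bigcarp_emb = rng.normal(size=(n_seqs, 128))  # Pfam-domain-level view

# Cross-granularity combination: concatenate the two views so a simple
# probe (e.g. a linear classifier) can draw on complementary signals
# rather than either level alone.
combined = np.concatenate([esm_emb, bigcarp_emb], axis=1)
print(combined.shape)  # → (8, 1408)
```

A downstream probe task would then fit a lightweight classifier on `combined` and compare its accuracy against probes trained on either view alone, which is the kind of evidence the paper uses to argue the levels are complementary rather than redundant.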

Abstract

Recent advances in general-purpose foundation models have stimulated the development of large biological sequence models. While natural language shows symbolic granularity (characters, words, sentences), biological sequences exhibit hierarchical granularity whose levels (nucleotides, amino acids, protein domains, genes) further encode biologically functional information. In this paper, we investigate the integration of cross-granularity knowledge from models through a case study of BiGCARP, a Pfam domain-level model for biosynthetic gene clusters, and ESM, an amino acid-level protein language model. Using representation analysis tools and a set of probe tasks, we first explain why a straightforward cross-model embedding initialization fails to improve downstream performance in BiGCARP, and show that deeper-layer embeddings capture a more contextual and faithful representation of the model's learned knowledge. Furthermore, we demonstrate that representations at different granularities encode complementary biological knowledge, and that combining them yields measurable performance gains in intermediate-level prediction tasks. Our findings highlight cross-granularity integration as a promising strategy for improving both the performance and interpretability of biological foundation models.
