AI Navigate

As Language Models Scale, Low-order Linear Depth Dynamics Emerge

arXiv cs.LG / 3/16/2026


Key Points

  • A 32-dimensional linear surrogate can accurately reproduce the layerwise sensitivity profile of GPT-2-large across multiple tasks such as toxicity, irony, hate speech, and sentiment.
  • The surrogate reveals how the final output shifts when small additive injections are made at each layer, enabling precise, interpretable analysis of depth dynamics (a probing sketch follows this list).
  • The authors uncover a scaling principle: for a fixed-order surrogate, agreement with the full model improves monotonically as model size increases across the GPT-2 family.
  • The linear surrogate enables principled multi-layer interventions that use less energy than standard heuristic schedules when applied to the full model.
  • Together, the results suggest that as language models scale, low-order linear depth dynamics emerge, providing a systems-theoretic basis for analysis and control.
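
To make the injection idea concrete, here is a minimal sketch, not the paper's exact protocol, of how a layerwise sensitivity profile can be probed in a Hugging Face GPT-2 model: add a small vector to each block's hidden states via a forward hook and record how much the final logits move. The prompt, the injection scale `epsilon`, and the norm-ratio metric are illustrative choices, not values from the paper.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2")  # the paper studies GPT-2-large; "gpt2" keeps the sketch light
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("The movie was surprisingly", return_tensors="pt")

with torch.no_grad():
    base_logits = model(**inputs).logits[:, -1, :]  # next-token logits without any injection

def make_hook(delta):
    """Forward hook that adds a fixed small vector to a block's output hidden states."""
    def hook(module, inp, out):
        # GPT-2 blocks return a tuple whose first element is the hidden states.
        return (out[0] + delta,) + out[1:]
    return hook

epsilon = 1e-2  # illustrative injection scale
sensitivity = []
for block in model.transformer.h:
    delta = epsilon * torch.randn(model.config.n_embd)
    handle = block.register_forward_hook(make_hook(delta))
    with torch.no_grad():
        perturbed_logits = model(**inputs).logits[:, -1, :]
    handle.remove()
    # Size of the output shift per unit of injected norm: one number per layer.
    sensitivity.append((perturbed_logits - base_logits).norm().item() / delta.norm().item())

print(sensitivity)  # layerwise sensitivity profile for this prompt
```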

Abstract

Large language models are often viewed as high-dimensional nonlinear systems and treated as black boxes. Here, we show that transformer depth dynamics admit accurate low-order linear surrogates within context. Across tasks including toxicity, irony, hate speech, and sentiment, a 32-dimensional linear surrogate reproduces the layerwise sensitivity profile of GPT-2-large with near-perfect agreement, capturing how the final output shifts under additive injections at each layer. We then uncover a surprising scaling principle: for a fixed-order linear surrogate, agreement with the full model improves monotonically with model size across the GPT-2 family. This linear surrogate also enables principled multi-layer interventions that require less energy than standard heuristic schedules when applied to the full model. Together, our results reveal that as language models scale, low-order linear depth dynamics emerge within contexts, offering a systems-theoretic foundation for analyzing and controlling them.
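
As a rough illustration of what a fixed-order linear surrogate of depth dynamics could look like, the sketch below projects hidden states onto their top 32 principal directions and fits a per-layer linear transition by least squares. The function name `fit_linear_surrogate`, the `states` input format, and the PCA-plus-least-squares recipe are assumptions made for illustration; the paper's actual identification procedure may differ.

```python
import numpy as np

def fit_linear_surrogate(states, order=32):
    """Fit a reduced-order linear model of layer-to-layer dynamics.

    states: list of arrays, one per layer, each of shape (num_samples, d_model),
            holding hidden states for the same tokens/contexts at every layer.
    Returns a (d_model, order) projection basis and a list of (order, order)
    transition matrices A_l with z_{l+1} ~ A_l z_l in the reduced coordinates.
    """
    stacked = np.concatenate(states, axis=0)
    stacked = stacked - stacked.mean(axis=0, keepdims=True)
    # Low-order basis: top right-singular vectors of all hidden states.
    _, _, vt = np.linalg.svd(stacked, full_matrices=False)
    basis = vt[:order].T                          # (d_model, order)

    transitions = []
    for l in range(len(states) - 1):
        z_cur = states[l] @ basis                 # (num_samples, order)
        z_next = states[l + 1] @ basis
        # Least-squares fit of z_next ~ z_cur @ X; A_l = X.T for column-vector states.
        x, *_ = np.linalg.lstsq(z_cur, z_next, rcond=None)
        transitions.append(x.T)
    return basis, transitions
```

Given such a surrogate, a minimum-norm set of multi-layer injections toward a desired output shift can be computed from the composed transition matrices with a pseudoinverse solve, which is in the spirit of the energy-efficient interventions described above, though the paper's specific control scheme is not detailed here.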