Formalising the Logit Shift Induced by LoRA: A Technical Note

arXiv cs.LG / 4/23/2026


Key Points

  • The technical note introduces a first-order formal framework that quantifies how LoRA shifts a model's logits (the logit shift) and changes its fact margin.
  • The analysis linearises the adaptation effect via a first-order Fréchet approximation around the base model's trajectory.
  • Under this approximation, the multi-layer LoRA effect decomposes into a sum of layer-by-layer contributions plus a higher-order remainder that captures inter-layer coupling.
  • The framework characterises LoRA's effect mathematically rather than only empirically, with attention to approximation accuracy and interaction terms.

Abstract

This technical note provides a first-order formalisation of the logit shift and fact-margin change induced by Low-Rank Adaptation (LoRA). Using a first-order Fréchet approximation around the base model trajectory, we show that the multi-layer LoRA effect can be decomposed into a linear summation of layerwise contributions and a higher-order remainder term representing inter-layer coupling.
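The decomposition described in the abstract can be checked numerically. The sketch below is illustrative only: the toy three-layer tanh network, the rank-2 updates, and all variable names are assumptions for demonstration, not the note's actual setup. It compares the exact multi-layer logit shift against the sum of single-layer shifts; the gap between them is the inter-layer coupling remainder, which is higher-order in the LoRA scale `alpha`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-layer network: logits = W3 @ tanh(W2 @ tanh(W1 @ x)).
dims = [8, 16, 16, 10]
Ws = [rng.normal(scale=0.3, size=(dims[i + 1], dims[i])) for i in range(3)]

# Rank-2 LoRA update Delta_l = alpha * B_l @ A_l per layer (illustrative).
alpha = 1e-3
deltas = [alpha * rng.normal(size=(dims[i + 1], 2)) @ rng.normal(size=(2, dims[i]))
          for i in range(3)]
x = rng.normal(size=dims[0])

def logits(weights):
    h = x
    for W in weights[:-1]:
        h = np.tanh(W @ h)
    return weights[-1] @ h

base = logits(Ws)

# Exact multi-layer shift: all LoRA deltas applied simultaneously.
exact_shift = logits([W + d for W, d in zip(Ws, deltas)]) - base

# Layerwise contributions: apply one layer's delta at a time, others at base.
layerwise = [
    logits([W + (d if i == l else 0.0) for i, (W, d) in enumerate(zip(Ws, deltas))]) - base
    for l in range(3)
]
linear_sum = np.sum(layerwise, axis=0)

# Remainder = inter-layer coupling; expected to be O(alpha^2), hence tiny here.
remainder = exact_shift - linear_sum
ratio = np.linalg.norm(remainder) / np.linalg.norm(exact_shift)
print(f"relative remainder: {ratio:.2e}")
```

Shrinking `alpha` by a factor of 10 should shrink the relative remainder by roughly another factor of 10, consistent with the remainder being a second-order coupling term.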