Deep Neural Regression Collapse
arXiv cs.LG / 3/26/2026
Key Points
- The paper extends the concept of Neural Collapse from classification to regression, showing that Neural Regression Collapse (NRC) occurs not only at the last layer but also throughout earlier layers in deep neural regression models.
- It provides evidence that, in the collapsed layers, the learned features span a subspace whose dimension and covariance match those of the targets, and that the input subspace of each layer's weights coincides with the subspace of the features it receives.
- The authors demonstrate that the linear prediction error of features in collapsed layers closely matches the model's overall prediction error, indicating that the collapsed representations already carry essentially everything the model uses to predict (see the sketch after this list).
- They further show that models exhibiting Deep NRC recover the intrinsic dimension of low-rank targets, and they analyze both the role of weight decay in inducing Deep NRC and whether it is necessary.
- Overall, the work delivers a more complete, multi-layer characterization of the simple structure deep networks can learn in regression settings.
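To make these diagnostics concrete, the sketch below trains a small MLP on a synthetic low-rank regression task and checks three NRC signatures at each hidden layer: the effective dimension of the features against the target's intrinsic dimension, the alignment between each layer's feature subspace and the input subspace of the next layer's weights, and a linear probe's error against the full model's error. This is a minimal illustration under assumed choices (the architecture, data, optimizer, thresholds, and helper names like `effective_rank` and `subspace_overlap` are all hypothetical), not the paper's experimental protocol.

```python
# Minimal sketch of deep-NRC diagnostics; everything here is illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression task whose 5-dim targets have intrinsic dimension 2:
# Y lies (up to noise) in a rank-2 subspace of the output space.
n, d_in, d_out, d_target = 2048, 20, 5, 2
X = torch.randn(n, d_in)
A = torch.randn(d_in, d_target) @ torch.randn(d_target, d_out)  # rank-2 map
Y = X @ A + 0.01 * torch.randn(n, d_out)

layers = nn.ModuleList([
    nn.Sequential(nn.Linear(d_in, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
])
head = nn.Linear(64, d_out)
model = nn.Sequential(*layers, head)

# Weight decay matters: the paper argues it is key to inducing deep NRC.
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-3)
for _ in range(3000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), Y)
    loss.backward()
    opt.step()

def effective_rank(M, tol=0.01):
    """Count singular values above tol * the largest one."""
    s = torch.linalg.svdvals(M)
    return int((s > tol * s[0]).sum())

def subspace_overlap(U, V):
    """Mean squared cosine of principal angles between column spaces (1 = aligned)."""
    Qu, _ = torch.linalg.qr(U)
    Qv, _ = torch.linalg.qr(V)
    return torch.linalg.svdvals(Qu.T @ Qv).pow(2).mean().item()

with torch.no_grad():
    full_mse = nn.functional.mse_loss(model(X), Y).item()
    H = X
    for i, block in enumerate(layers):
        H = block(H)
        Hc = H - H.mean(0)  # centered features of layer i

        # Signature 1: collapsed features should have effective dimension
        # close to d_target, far below the layer width.
        r = effective_rank(Hc)

        # Signature 2: the input subspace of the next layer's weights should
        # align with the top feature directions of this layer.
        nxt = layers[i + 1][0] if i + 1 < len(layers) else head
        _, _, Vh = torch.linalg.svd(Hc, full_matrices=False)
        _, _, Wh = torch.linalg.svd(nxt.weight, full_matrices=False)
        k = max(r, 1)
        align = subspace_overlap(Vh[:k].T, Wh[:k].T)

        # Signature 3: a linear probe on these features should already match
        # the full model's prediction error.
        Hb = torch.cat([H, torch.ones(n, 1)], dim=1)  # append bias column
        W = torch.linalg.lstsq(Hb, Y).solution
        probe_mse = nn.functional.mse_loss(Hb @ W, Y).item()

        print(f"layer {i}: eff. rank {r} (target dim {d_target}), "
              f"weight/feature alignment {align:.2f}, "
              f"probe MSE {probe_mse:.4f} vs model MSE {full_mse:.4f}")
```

Rerunning the same sketch with `weight_decay=0.0` is the natural ablation for the last key point: per the paper's analysis, collapse should weaken or vanish, leaving the hidden layers at a much higher effective rank.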