Deep Neural Regression Collapse
arXiv cs.LG / 2026/3/26
Key points
- The paper extends the concept of Neural Collapse from classification to regression, showing that Neural Regression Collapse (NRC) occurs not only at the last layer but also throughout earlier layers in deep neural regression models.
- It provides evidence that, in the “collapsed” layers, learned features and covariances align with the target’s dimensionality and covariance structure, and that the layer weights’ input subspace matches the feature subspace.
- The authors demonstrate that the linear prediction error of features in collapsed layers closely matches the model's overall prediction error, indicating that the collapsed representations already carry essentially all of the information the model uses to predict.
- They further show that models exhibiting Deep NRC learn the intrinsic dimension of low-rank targets and analyze the role and necessity of weight decay in inducing Deep NRC.
- Overall, the work delivers a more complete, multi-layer characterization of the simple structure deep networks can learn in regression settings.
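The collapse diagnostics summarized above can be illustrated with a small synthetic sketch (not the paper's code; all dimensions, noise levels, and thresholds here are illustrative assumptions): features that have collapsed onto a d-dimensional subspace should have effective rank d, their top principal subspace should align with the target-relevant directions, and a simple linear probe on them should predict the targets about as well as the full model.

```python
import numpy as np

# Synthetic sketch of Deep NRC-style diagnostics (NOT the paper's code).
rng = np.random.default_rng(0)
N, p, d = 1000, 64, 3                          # samples, feature width, target dim

U, _ = np.linalg.qr(rng.normal(size=(p, d)))   # hypothetical collapsed subspace
Z = rng.normal(size=(N, d))                    # latent coordinates
H = Z @ U.T + 1e-2 * rng.normal(size=(N, p))   # "collapsed" layer features (rank ~ d)
Y = Z @ rng.normal(size=(d, d))                # low-rank regression targets

# 1) Effective rank of the centered features matches the target dimension d.
Hc = H - H.mean(axis=0)
s = np.linalg.svd(Hc, compute_uv=False)
eff_rank = int(np.sum(s > 0.05 * s[0]))

# 2) Top-d principal subspace of the features aligns with U
#    (smallest principal-angle cosine close to 1).
V = np.linalg.svd(Hc.T @ Hc)[0][:, :d]
align = np.linalg.svd(V.T @ U, compute_uv=False).min()

# 3) A linear probe on the collapsed features predicts the targets well,
#    mirroring the claim that probe error tracks the model's prediction error.
W, *_ = np.linalg.lstsq(H, Y, rcond=None)
probe_err = float(np.mean((H @ W - Y) ** 2))
```

Running this, `eff_rank` equals `d`, `align` is near 1, and `probe_err` sits at the noise floor; in the paper's setting these quantities are measured on the features of trained networks rather than synthetic data.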
