Neural Collapse Dynamics: Depth, Activation, Regularisation, and Feature Norm Threshold
arXiv cs.LG / 4/2/2026
Key Points
- The paper studies how neural collapse (convergence of penultimate-layer features to a simplex equiangular tight frame) begins, moving beyond equilibrium-only understanding to characterize onset dynamics.
- It proposes a predictive regularity: neural collapse occurs when the mean feature norm crosses a model–dataset-specific critical threshold f_{n}*, a value largely invariant to training conditions within each (model, dataset) pair.
- In standard training trajectories, the feature-norm threshold crossing consistently precedes neural collapse onset, with a mean lead time of 62 epochs, making the crossing a practical timing predictor (a monitoring sketch follows this list).
- A gradient-flow intervention experiment indicates that f_{n}* behaves as a stable attractor: perturbations to the feature scale self-correct during training, converging back to the same threshold value (a toy version of the intervention is also sketched below).
- Across an architecture × dataset grid, the strongest effect is architectural: on MNIST, for example, ResNet-20 yields f_{n}* = 5.867, and architecture–dataset interactions are strongly non-additive; weight decay additionally organizes a phase diagram of collapse behavior, while greater width accelerates it.
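The threshold test described above is straightforward to instrument. The sketch below is a minimal monitoring loop, assuming a PyTorch classifier whose penultimate activations can be captured with a forward hook; the helper name `feature_stats`, the layer handle, and the surrounding setup are illustrative placeholders, not code or values from the paper.

```python
import torch

@torch.no_grad()
def feature_stats(model, penultimate_layer, loader, num_classes, device="cpu"):
    """Mean penultimate-feature norm plus an NC1-style ratio of
    within-class to between-class scatter, over one pass of `loader`."""
    feats, labels = [], []
    hook = penultimate_layer.register_forward_hook(
        lambda mod, inp, out: feats.append(out.flatten(1).cpu())
    )
    model.eval()
    for x, y in loader:
        model(x.to(device))
        labels.append(y)
    hook.remove()

    H = torch.cat(feats)                     # (N, d) penultimate features
    y = torch.cat(labels)
    mean_norm = H.norm(dim=1).mean().item()  # the quantity compared to f_n*

    global_mean = H.mean(0)
    class_means = torch.stack([H[y == c].mean(0) for c in range(num_classes)])
    within = torch.stack([
        (H[y == c] - class_means[c]).pow(2).sum(1).mean()
        for c in range(num_classes)
    ]).mean()
    between = (class_means - global_mean).pow(2).sum(1).mean()
    return mean_norm, (within / between).item()
```

Called once per epoch, the first epoch at which `mean_norm` exceeds the calibrated threshold for the (model, dataset) pair gives the crossing time; per the reported regularity, collapse onset would then be expected roughly 62 epochs later on average.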
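The attractor claim can also be probed in miniature. The sketch below is not the paper's experiment: it substitutes the standard unconstrained-features model (features trained as free parameters alongside a linear classifier) and discrete SGD in place of gradient flow, then applies a mid-run rescaling of the features to see whether the mean norm self-corrects.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
C, n, d = 4, 32, 16                            # classes, samples/class, feature dim
y = torch.arange(C).repeat_interleave(n)       # labels
H = torch.randn(C * n, d, requires_grad=True)  # free "penultimate features"
W = torch.randn(C, d, requires_grad=True)      # linear classifier
opt = torch.optim.SGD([H, W], lr=0.1, weight_decay=5e-3)

for step in range(4001):
    loss = F.cross_entropy(H @ W.T, y)         # logits are H W^T
    opt.zero_grad()
    loss.backward()
    opt.step()

    if step == 2000:                           # intervention: halve feature scale
        with torch.no_grad():
            H.mul_(0.5)

    if step % 500 == 0:
        print(f"step {step:5d}  mean ||h|| = {H.norm(dim=1).mean().item():.3f}")
```

If the printed mean norm recovers after the step-2000 rescaling and settles back near its pre-intervention level, that is the qualitative self-correction the paper attributes to f_{n}*; an actual replication would run the intervention on the trained architectures and datasets themselves rather than this toy model.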