Pruning-induced phases in fully-connected neural networks: the eumentia, the dementia, and the amentia
arXiv cs.LG / 3/16/2026
Opinion / Ideas & Deep Analysis / Models & Research
Key Points
- The authors define three phases of dropout-induced pruning in fully-connected networks: eumentia (learning), dementia (forgetting), and amentia (inability to learn), distinguished by how the test cross-entropy loss scales with training-set size.
- By independently varying the dropout rate at training time and at evaluation time on MNIST, they construct a phase diagram whose boundaries are robust across network widths and depths (a hypothetical version of this sweep is sketched after this list).
- The transition between eumentia and dementia is accompanied by scale invariance and a diverging length scale, hallmarks of a Berezinskii-Kosterlitz-Thouless-like transition, linking pruning behavior to statistical mechanics (the relevant functional forms are recalled below).
- The work suggests pruning-induced neural behavior can be understood through neural scaling laws and universality classes, offering a theoretical lens for model compression.
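To make the first two points concrete, here is a minimal, hypothetical PyTorch sketch of such an experiment: train a small fully-connected network on MNIST subsets of increasing size with dropout rate `p_train`, evaluate with dropout still active at rate `p_eval`, and record how the test cross-entropy moves with training-set size. The architecture, hyperparameters, and the `(p_train, p_eval)` grid below are illustrative assumptions, not the paper's actual setup.

```python
# Hypothetical sketch of the kind of experiment described above; all
# names, architecture choices, and hyperparameters are illustrative
# assumptions, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms


def make_mlp(p_drop, width=256, depth=2):
    """Fully-connected MNIST classifier with dropout after each hidden layer."""
    layers = [nn.Flatten()]
    in_dim = 28 * 28
    for _ in range(depth):
        layers += [nn.Linear(in_dim, width), nn.ReLU(), nn.Dropout(p_drop)]
        in_dim = width
    layers.append(nn.Linear(in_dim, 10))
    return nn.Sequential(*layers)


def test_cross_entropy(p_train, p_eval, n_train, train_ds, test_loader, epochs=3):
    """Train with dropout p_train on n_train examples, then evaluate with
    dropout still active at rate p_eval; return mean test cross-entropy."""
    model = make_mlp(p_train)
    loader = DataLoader(Subset(train_ds, range(n_train)),
                        batch_size=128, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
    # Evaluation-time pruning: set every Dropout module's rate to p_eval and
    # leave those modules in train mode so units keep being dropped.
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.p = p_eval
            m.train()
    total, count = 0.0, 0
    with torch.no_grad():
        for x, y in test_loader:
            total += F.cross_entropy(model(x), y, reduction="sum").item()
            count += y.numel()
    return total / count


if __name__ == "__main__":
    tfm = transforms.ToTensor()
    train_ds = datasets.MNIST("data", train=True, download=True, transform=tfm)
    test_ds = datasets.MNIST("data", train=False, download=True, transform=tfm)
    test_loader = DataLoader(test_ds, batch_size=512)
    # Sweep a few (p_train, p_eval) cells; within each cell, watch how the
    # loss moves as the training set grows: falling loss would suggest an
    # eumentia-like cell, rising loss a dementia-like cell, and flat loss
    # an amentia-like cell.
    for p_train, p_eval in [(0.1, 0.1), (0.1, 0.6), (0.6, 0.9)]:
        for n_train in (1_000, 4_000, 16_000):
            ce = test_cross_entropy(p_train, p_eval, n_train,
                                    train_ds, test_loader)
            print(f"p_train={p_train} p_eval={p_eval} "
                  f"n_train={n_train}: test CE={ce:.3f}")
```

Keeping dropout active at evaluation time, rather than applying the usual inference-time rescaling, is what makes the evaluation-time rate an independent pruning axis in this sketch.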
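For the third point, a hedged reminder of the functional forms usually invoked in this kind of analysis; the symbols α, b, p, and p_c below are placeholders, not the paper's notation.

```latex
% Illustrative forms only; \alpha, b, p, and p_c are placeholder symbols.
% Neural scaling law: in a learning (eumentia-like) regime, test
% cross-entropy is expected to fall as a power law in training-set size D.
L(D) \sim D^{-\alpha}, \qquad \alpha > 0
% BKT hallmark: a length scale \xi diverging with an essential singularity
% as the dropout rate p approaches a critical value p_c, rather than the
% power-law divergence of an ordinary second-order transition.
\xi(p) \sim \exp\!\left(\frac{b}{\sqrt{|p - p_c|}}\right), \qquad p \to p_c
```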
Related Articles
I Was Wrong About AI Coding Assistants. Here's What Changed My Mind (and What I Built About It).
Dev.to

Interesting loop
Reddit r/LocalLLaMA
Qwen3.5-122B-A10B Uncensored (Aggressive) — GGUF Release + new K_P Quants
Reddit r/LocalLLaMA
A supervisor or "manager" AI agent is the wrong way to control AI
Reddit r/artificial
FeatherOps: Fast fp8 matmul on RDNA3 without native fp8
Reddit r/LocalLLaMA