Dense Neural Networks are not Universal Approximators
arXiv stat.ML / 4/17/2026
💬 Opinion · Ideas & Deep Analysis · Models & Research
Key Points
- The paper challenges the usual “universal approximation” intuition by proving that dense (fully connected) neural networks are not universal approximators under realistic constraints on weights and network dimensions.
- It uses a model-compression-style argument, combining the weak regularity lemma with a reinterpretation of feedforward networks as message-passing graph neural networks (a minimal code sketch of this reinterpretation follows this list).
- For ReLU networks with constrained weight values and constrained inputs/outputs, the authors show there exist Lipschitz-continuous functions that these dense architectures cannot approximate (a generic formalization of this claim appears after the list).
- The results indicate intrinsic limitations of dense layers and motivate sparse connectivity as an essential ingredient to recover universality-like approximation behavior.
- Overall, the work reframes approximation capability as being strongly dependent on architectural restrictions rather than just network size.
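The sketch below is not taken from the paper; it only illustrates the reinterpretation referenced in the second key point. A fully connected ReLU layer `y = relu(W @ x + b)` can be read as one round of message passing on the complete bipartite graph between input and output neurons, where each output neuron aggregates a weighted message from every input neuron. All function names here are illustrative.

```python
import numpy as np

def dense_relu(W: np.ndarray, b: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Standard fully connected ReLU layer."""
    return np.maximum(W @ x + b, 0.0)

def message_passing_relu(W: np.ndarray, b: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Same layer, written as message passing on a complete bipartite graph:
    output neuron j receives the message W[j, i] * x[i] from every input
    neuron i, sums the messages, adds its bias, and applies ReLU."""
    n_out, n_in = W.shape
    y = np.zeros(n_out)
    for j in range(n_out):                                 # receiver: output neuron j
        messages = [W[j, i] * x[i] for i in range(n_in)]   # one message per input neuron
        y[j] = max(sum(messages) + b[j], 0.0)              # aggregate, bias, ReLU
    return y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W, b, x = rng.normal(size=(4, 3)), rng.normal(size=4), rng.normal(size=3)
    assert np.allclose(dense_relu(W, b, x), message_passing_relu(W, b, x))
    print("dense ReLU layer == one round of message passing on a complete bipartite graph")
```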
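As a generic formalization of the third key point (the notation below is not the paper's own), "not a universal approximator" means there is a Lipschitz target function that every network in the constrained dense class misses by at least a fixed margin in the uniform norm:

$$
\exists\, f:[0,1]^d \to \mathbb{R} \ \text{$L$-Lipschitz},\ \exists\, \varepsilon > 0:\quad
\inf_{N \in \mathcal{F}_{\mathrm{dense}}} \ \sup_{x \in [0,1]^d} \bigl| f(x) - N(x) \bigr| \ \ge\ \varepsilon,
$$

where $\mathcal{F}_{\mathrm{dense}}$ stands for the dense ReLU networks with the constrained weight values and dimensions considered in the paper. The classical universal approximation theorem asserts the opposite (the infimum is $0$) when width is unconstrained.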
