No More DeLuLu: Physics-Inspired Kernel Networks for Geometrically-Grounded Neural Computation
arXiv cs.LG / 3/16/2026
Key Points
- The paper introduces the yat-product, a physics-inspired kernel operator that combines quadratic alignment with inverse-square proximity and is proven to be a Mercer kernel with a unique RKHS embedding (see the first sketch after this list).
- Neural Matter Networks (NMNs) use the yat-product as the sole non-linearity, shifting normalization into the kernel itself and replacing conventional activation-plus-normalization blocks with a single geometrically grounded operation (see the layer sketch after this list).
- Empirically, NMN-based classifiers match linear baselines on MNIST while exhibiting bounded prototype evolution and robustness to superposition; Aether-GPT2, which uses yat-based attention and MLP blocks, achieves lower validation loss than a GPT-2 baseline at a comparable parameter budget.
- The framework is positioned as unifying kernel learning, gradient stability, and information geometry, establishing NMNs as a principled alternative to conventional neural architectures.
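
To make the first key point concrete, here is a minimal NumPy sketch of the operator. It assumes the yat-product takes the form ⟨w, x⟩² / (‖w − x‖² + ε), consistent with "quadratic alignment with inverse-square proximity"; the ε stabilizer and the function name are our assumptions, not the paper's notation.

```python
import numpy as np

def yat_product(w: np.ndarray, x: np.ndarray, eps: float = 1e-6) -> float:
    """Sketch of the yat-product: squared dot product (quadratic alignment)
    divided by squared Euclidean distance (inverse-square proximity).
    The eps term guarding the w == x singularity is an assumption."""
    alignment = np.dot(w, x) ** 2           # quadratic alignment term
    proximity = np.sum((w - x) ** 2) + eps  # squared distance, stabilized
    return alignment / proximity
```

Note how the output grows when w and x are both aligned and close, so similarity and normalization are handled by one operation rather than by a separate activation and norm layer.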
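Building on that form, the following toy layer illustrates the second key point: a dense NMN-style layer whose only non-linearity is the yat-product between the input and each prototype row of the weight matrix. The class name, initialization scheme, and ε are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class YatDense:
    """Toy NMN-style dense layer: each output unit is the yat-product
    between the input and one prototype row of W. No separate activation
    or normalization is applied, matching the 'sole non-linearity' claim."""

    def __init__(self, in_dim: int, out_dim: int, eps: float = 1e-6, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Scaled Gaussian init is an assumption made for this sketch.
        self.W = rng.normal(scale=in_dim ** -0.5, size=(out_dim, in_dim))
        self.eps = eps

    def __call__(self, x: np.ndarray) -> np.ndarray:
        align = (self.W @ x) ** 2                  # quadratic alignment per unit
        dist2 = np.sum((self.W - x) ** 2, axis=1)  # squared distance to each prototype
        return align / (dist2 + self.eps)          # inverse-square proximity weighting

# Example: a 3-unit layer on a 4-dimensional input.
layer = YatDense(in_dim=4, out_dim=3)
y = layer(np.array([1.0, 0.5, -0.5, 0.25]))
print(y.shape)  # (3,)
```

Treating each weight row as a prototype is what gives the "bounded prototype evolution" reading of the MNIST results: units respond most strongly to inputs near their prototype, not merely to inputs with large projections.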