KANs need curvature: penalties for compositional smoothness
arXiv stat.ML · May 5, 2026
📰 News · Ideas & Deep Analysis · Models & Research
Key Points
- The paper argues that Kolmogorov-Arnold Networks (KANs) that fit data well often develop pathological high-curvature oscillations in their learned activation functions, undermining interpretability.
- It shows that commonly used regularization penalties fail to suppress these oscillations.
- The authors derive a basis-agnostic curvature penalty that yields substantially smoother activation functions while preserving model accuracy (a minimal sketch of such a penalty follows this list).
- By analyzing how function composition propagates curvature, the paper proves an upper bound linking the model's overall curvature to the per-layer quantities the penalty controls, and uses this bound to motivate more expressive penalty formulations (see the chain-rule identity sketched below).
- The work aims to ease the accuracy–interpretability trade-off in KANs, strengthening their usefulness for both prediction and scientific insight.
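
For concreteness, here is a minimal sketch (not the paper's formulation) of one basis-agnostic way to penalize activation curvature: estimate ∫ φ''(x)² dx for each learned activation φ by finite differences on a sample grid, so the penalty depends only on the function's values, not on its B-spline or other parameterization. The helper name, grid size, and interval below are illustrative assumptions.

```python
import torch

def curvature_penalty(phi, lo=-1.0, hi=1.0, n=64):
    """Approximate the integral of squared curvature, ∫ phi''(x)^2 dx,
    over [lo, hi] for a 1-D activation `phi` (any callable on tensors).
    Basis-agnostic: only evaluations of phi are used, not its parameters.
    (Hypothetical helper; grid size and interval are assumptions.)"""
    x = torch.linspace(lo, hi, n)
    h = x[1] - x[0]
    y = phi(x)
    # Central second difference: phi''(x_i) ≈ (y[i-1] - 2 y[i] + y[i+1]) / h^2
    d2 = (y[:-2] - 2.0 * y[1:-1] + y[2:]) / h**2
    # Riemann-style estimate of the integral of squared curvature
    return (d2**2).mean() * (hi - lo)

# Toy check: a wiggly activation is penalized far more than a gentle one.
wiggly = lambda x: torch.sin(8.0 * x)   # phi'' = -64 sin(8x): large penalty
gentle = lambda x: 0.5 * x**2           # phi'' = 1: penalty ≈ 2.0 on [-1, 1]
print(curvature_penalty(wiggly).item(), curvature_penalty(gentle).item())
```

In training, a term like this would be summed over all edge activations and added to the task loss with a smoothness weight.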
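The composition analysis in the fourth point can be motivated by the standard second-derivative chain rule (basic calculus, not the paper's actual proof): for a two-layer composition g ∘ f, the composed curvature involves both the curvature and the slope of each layer.

```latex
% Second derivative of a composition (chain rule applied twice):
(g \circ f)''(x) = g''\big(f(x)\big)\,f'(x)^2 + g'\big(f(x)\big)\,f''(x)

% Taking sup norms yields an upper bound on the composed curvature:
\|(g \circ f)''\|_\infty \le \|g''\|_\infty \,\|f'\|_\infty^{2}
    + \|g'\|_\infty \,\|f''\|_\infty
```

Bounds of this shape suggest controlling first derivatives alongside second derivatives, consistent with the paper's case for more expressive penalty formulations.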