LipKernel: Lipschitz-Bounded Convolutional Neural Networks via Dissipative Layers
arXiv stat.ML / 4/10/2026
Key Points
- The paper introduces LipKernel, a new CNN layer-wise parameterization that enforces prescribed Lipschitz bounds by requiring each layer to satisfy a linear matrix inequality (LMI) guaranteeing dissipativity.
- Because the dissipative convolution kernels are parameterized via a 2-D Roesser-type state-space model, the trained convolution layers remain in standard form, avoiding the inference-time overhead of some frequency-domain approaches.
- The authors report that their runtime is orders of magnitude faster than prior state-of-the-art Lipschitz-bounded networks that parameterize convolutions in the Fourier domain.
- LipKernel is presented as more expressive than spectral-norm-bound or orthogonal-layer approaches, while also supporting common CNN components such as 1-D/2-D convolutions, pooling, striding/dilation, and zero padding.
- The method is positioned as especially useful for robust real-time perception/control tasks in robotics, autonomous vehicles, and automation, and it is claimed to extend beyond CNNs to any incrementally dissipative layer.
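For context on what a "prescribed Lipschitz bound" buys you: the Lipschitz constant of a (linear) convolution layer is the spectral norm of its associated linear operator, and per-layer bounds multiply across a network with 1-Lipschitz activations. The sketch below is not the paper's LMI/Roesser parameterization; it is a generic power-iteration estimate of a zero-padded conv layer's Lipschitz constant, using the standard fact that `convolve2d` is the adjoint of `correlate2d` for odd kernel sizes with zero padding. All names here (`conv_lipschitz`, the example kernel) are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate2d, convolve2d

def conv_lipschitz(kernel, shape, iters=500, seed=0):
    """Estimate the Lipschitz constant (= spectral norm) of the linear map
    A x = correlate2d(x, kernel, mode='same')  (a zero-padded conv layer)
    by power iteration on A^T A. For odd kernel sizes with zero padding,
    the adjoint A^T y is convolve2d(y, kernel, mode='same')."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = correlate2d(x, kernel, mode='same')   # A x
        x = convolve2d(y, kernel, mode='same')    # A^T (A x)
        x /= np.linalg.norm(x)                    # renormalize iterate
    # Rayleigh-style estimate: ||A x|| for the converged unit vector x
    return np.linalg.norm(correlate2d(x, kernel, mode='same'))

# Example: a small smoothing kernel on 5x5 inputs
k = np.array([[0., 1., 0.], [1., 2., 1.], [0., 1., 0.]]) / 4.0
sigma = conv_lipschitz(k, (5, 5))
```

Since Lipschitz constants compose multiplicatively, dividing each layer's kernel by its estimated `sigma` (times a per-layer share of the target bound) is the crude "spectral rescaling" baseline that parameterizations like LipKernel aim to improve on in expressiveness.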