LTBs-KAN: Linear-Time B-splines Kolmogorov-Arnold Networks

arXiv cs.LG / 4/27/2026

📰 News · Models & Research

Key Points

  • The paper proposes a new Kolmogorov-Arnold Network variant called LTBs-KAN that improves the practicality of KANs by targeting their main bottleneck: slow computation relative to MLPs.
  • LTBs-KAN uses Linear-Time B-spline computation to reduce complexity, avoiding more computationally intensive spline evaluation approaches used in prior work.
  • The method also reduces model parameters during the forward pass via a product-of-sums matrix factorization technique, aiming to keep performance intact.
  • Experiments on MNIST, Fashion-MNIST, and CIFAR-10 show that LTBs-KAN delivers favorable time complexity and parameter reductions compared with other KAN implementations when used as architectural building blocks.
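The bottleneck the first two bullets refer to is the classical recursive evaluation of B-spline basis functions (the de Boor-Mansfield-Cox recursion), whose cost grows with the spline degree. The sketch below shows that baseline recursion only, to make the bottleneck concrete; it is not the paper's LTBs-KAN evaluation, which replaces this scheme:

```python
import numpy as np

def bspline_basis(i, k, t, knots):
    """Classical de Boor-Mansfield-Cox recursion for the i-th
    B-spline basis function of degree k at parameter t.

    Shown as the baseline that KANs traditionally rely on; each
    degree-k evaluation recursively spawns two degree-(k-1) calls,
    which is the cost LTBs-KAN is designed to avoid.
    """
    if k == 0:
        # Degree-0 basis: indicator of the knot interval [t_i, t_{i+1}).
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left_den = knots[i + k] - knots[i]
    right_den = knots[i + k + 1] - knots[i + 1]
    left = 0.0 if left_den == 0 else \
        (t - knots[i]) / left_den * bspline_basis(i, k - 1, t, knots)
    right = 0.0 if right_den == 0 else \
        (knots[i + k + 1] - t) / right_den * bspline_basis(i + 1, k - 1, t, knots)
    return left + right

knots = np.arange(8, dtype=float)      # uniform knot vector (illustrative)
print(bspline_basis(2, 2, 3.5, knots))  # → 0.75
```

On a uniform knot vector, the quadratic basis peaks at 0.75 at the center of its support; the point here is simply that every such value costs a tree of recursive calls, repeated for every spline edge in a KAN layer.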

Abstract

Kolmogorov-Arnold Networks (KANs) are a recent neural network architecture offering an alternative to Multilayer Perceptrons (MLPs) with improved explainability and expressibility. However, KANs are significantly slower than MLPs due to the recursive nature of B-spline function computations, which limits their application. This work addresses these issues by proposing a novel base-spline Linear-Time B-splines Kolmogorov-Arnold Network (LTBs-KAN) with linear complexity. Unlike previous methods that rely on the de Boor-Mansfield-Cox spline algorithm or other computationally intensive mathematical functions, our approach significantly reduces the computational burden. Additionally, we further reduce the model's parameter count through product-of-sums matrix factorization in the forward pass without sacrificing performance. Experiments on MNIST, Fashion-MNIST, and CIFAR-10 demonstrate that LTBs-KAN achieves favorable time complexity and parameter reduction compared to other KAN implementations when used as an architectural building block.
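The abstract does not spell out the product-of-sums factorization, so the sketch below uses a plain two-factor (low-rank-style) decomposition purely to illustrate the general idea of how factorizing a forward-pass weight matrix shrinks the parameter count; the layer sizes and rank are hypothetical, and the paper's actual scheme may differ:

```python
import numpy as np

# Hypothetical layer sizes, chosen only for illustration.
d_in, d_out, rank = 784, 256, 16

# A dense layer stores d_in * d_out weights; a factored layer
# W ≈ A @ B stores only d_in * rank + rank * d_out.
dense_params = d_in * d_out                    # 784 * 256 = 200704
factored_params = d_in * rank + rank * d_out   # 12544 + 4096 = 16640

rng = np.random.default_rng(0)
A = rng.standard_normal((d_in, rank))
B = rng.standard_normal((rank, d_out))

x = rng.standard_normal((1, d_in))
y = (x @ A) @ B  # forward pass never materializes the full d_in x d_out matrix
assert y.shape == (1, d_out)

print(dense_params, factored_params)  # → 200704 16640
```

With these illustrative sizes the factored layer carries roughly 12x fewer parameters, which is the kind of forward-pass reduction the abstract claims, provided the factorization preserves enough expressive capacity.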