AI Navigate

From ex(p) to poly: Gaussian Splatting with Polynomial Kernels

arXiv cs.LG / 3/20/2026


Key Points

  • Replaces the original exponential kernel in Gaussian Splatting with a polynomial approximation combined with a ReLU, improving computational efficiency.
  • Maintains compatibility with existing datasets optimized for the original Gaussian kernel to ease adoption.
  • Reports 4-15% performance gains across 3DGS implementations with negligible impact on image quality.
  • Provides a mathematical analysis of the new kernel and discusses potential benefits for 3DGS on NPU hardware.

Abstract

Recent advancements in Gaussian Splatting (3DGS) have introduced various modifications to the original kernel, resulting in significant performance improvements. However, many of these kernel changes are incompatible with existing datasets optimized for the original Gaussian kernel, presenting a challenge for widespread adoption. In this work, we address this challenge by proposing an alternative kernel that maintains compatibility with existing datasets while improving computational efficiency. Specifically, we replace the original exponential kernel with a polynomial approximation combined with a ReLU function. This modification allows for more aggressive culling of Gaussians, leading to enhanced performance across different 3DGS implementations. Our results show a notable performance improvement of 4 to 15% with negligible impact on image quality. We also provide a detailed mathematical analysis of the new kernel and discuss its potential benefits for 3DGS implementations on NPU hardware.
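The core idea can be sketched numerically. The snippet below contrasts the standard Gaussian falloff `exp(-d/2)` with a polynomial-plus-ReLU surrogate; the paper's exact polynomial is not given in this summary, so the `(1 - d/s)^2` form and the scale `s` here are illustrative assumptions, not the authors' kernel.

```python
import numpy as np

# Standard 3DGS kernel: per-pixel opacity falloff alpha = exp(-d/2),
# where d is the squared Mahalanobis distance to the Gaussian's center.
def gaussian_kernel(d):
    return np.exp(-0.5 * d)

# Illustrative polynomial + ReLU surrogate (assumed form, not the
# paper's exact kernel): a clamped quadratic that matches the Gaussian
# near d = 0 in shape and reaches exactly zero at d >= s.
def poly_relu_kernel(d, s=8.0):
    return np.maximum(0.0, 1.0 - d / s) ** 2

d = np.linspace(0.0, 10.0, 6)
print(gaussian_kernel(d))   # never exactly zero, only asymptotically small
print(poly_relu_kernel(d))  # exactly zero beyond the support radius
```

The practical difference motivating the culling claim: `exp` is strictly positive everywhere, so implementations must compare each splat against a small epsilon threshold, whereas the polynomial kernel has compact support and lets Gaussians with `d >= s` be culled outright.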