Two-Dimensional Deep ReLU CNN Approximation for Korobov Functions: A Constructive Approach

arXiv stat.ML / 4/20/2026


Key Points

  • The paper analyzes how well two-dimensional deep CNNs can approximate Korobov functions as a benchmark problem.
  • It studies a specific 2D CNN architecture: multi-channel convolution layers with zero-padding and ReLU activations, followed by a fully connected layer.
  • The authors introduce a fully constructive method to build 2D CNNs for approximating Korobov functions, along with a rigorous complexity analysis of the resulting networks.
  • Results show that the CNNs achieve near-optimal approximation rates under the continuous weight selection model, helping to mitigate the curse of dimensionality.
  • Overall, the work lays theoretical groundwork for using 2D CNNs in function approximation and suggests broader applicability beyond the benchmark.

Abstract

This paper investigates the approximation capabilities of two-dimensional (2D) deep convolutional neural networks (CNNs), with Korobov functions serving as a benchmark. We focus on 2D CNNs comprising multi-channel convolutional layers with zero-padding and ReLU activations, followed by a fully connected layer. We propose a fully constructive approach for building 2D CNNs to approximate Korobov functions and provide a rigorous analysis of the complexity of the constructed networks. Our results demonstrate that 2D CNNs achieve near-optimal approximation rates under the continuous weight selection model, significantly alleviating the curse of dimensionality. This work provides a solid theoretical foundation for 2D CNNs and illustrates their potential for broader applications in function approximation.
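To make the architecture class concrete, here is a minimal NumPy sketch of a forward pass through a network of the shape described above: multi-channel convolutional layers with zero-padding and ReLU activations, followed by a single fully connected layer. The input size, channel counts, kernel size, and random weights are illustrative assumptions for this sketch, not the paper's actual construction or its trained parameters.

```python
import numpy as np

def conv2d_zero_pad(x, kernels):
    """Multi-channel 2D convolution with zero-padding ("same" output size).
    x: array of shape (C_in, H, W); kernels: (C_out, C_in, k, k), k odd."""
    c_out, c_in, k, _ = kernels.shape
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))  # zero-pad the spatial dims only
    H, W = x.shape[1:]
    out = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(H):
            for j in range(W):
                # inner product of the kernel with the padded input patch
                out[o, i, j] = np.sum(xp[:, i:i + k, j:j + k] * kernels[o])
    return out

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 8, 8))       # single-channel 8x8 input (assumed size)
k1 = rng.standard_normal((4, 1, 3, 3))   # layer 1: 1 -> 4 channels, 3x3 kernels
k2 = rng.standard_normal((4, 4, 3, 3))   # layer 2: 4 -> 4 channels

h = relu(conv2d_zero_pad(relu(conv2d_zero_pad(x, k1)), k2))
W_fc = rng.standard_normal((1, h.size))  # final fully connected layer
y = W_fc @ h.reshape(-1)                 # scalar network output, shape (1,)
```

Because the padding preserves spatial size, each hidden layer keeps the 8x8 grid, and the fully connected layer reads the flattened final feature map, matching the "convolutions + one dense layer" structure the paper analyzes.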