Inductive Convolution Nuclear Norm Minimization for Tensor Completion with Arbitrary Sampling

arXiv cs.CV / 4/21/2026


Key Points

  • The paper proposes ICNNM, a new approach to tensor completion under arbitrary sampling (TCAS), extending the earlier Convolution Nuclear Norm Minimization (CNNM) framework.
  • CNNM’s optimization is computationally expensive because it requires multiple Singular Value Decompositions (SVDs), and the authors target this bottleneck.
  • ICNNM reformulates the objective using convolution eigenvectors and uses pre-learned, shared convolution eigenvectors across tensors to bypass the SVD step.
  • The method not only reduces computational time substantially but also improves recovery performance by injecting additional prior knowledge through the pre-learned eigenvectors.
  • Experiments on video completion, prediction, and frame interpolation show ICNNM outperforming CNNM and several other baselines.
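The core computational contrast can be illustrated with a toy sketch. Below, `svt` is the standard SVD-based singular value thresholding step used by nuclear-norm solvers such as CNNM, while `fixed_basis_shrink` shows the SVD-free idea in the spirit of ICNNM: shrink coefficients in a fixed, pre-learned basis instead of recomputing an SVD each iteration. This is a hypothetical illustration, not the paper's actual algorithm; in particular, `U` here stands in for the pre-learned convolution eigenvectors.

```python
import numpy as np

def svt(X, tau):
    # SVD-based proximal step for the nuclear norm (as in CNNM-style
    # solvers): one full SVD per call, which is the bottleneck.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fixed_basis_shrink(X, U, tau):
    # SVD-free alternative in the spirit of ICNNM: project onto a
    # pre-learned orthonormal basis U (a stand-in for the paper's shared
    # convolution eigenvectors), soft-threshold, and map back.
    C = U.T @ X                                   # coefficients in the fixed basis
    C = np.sign(C) * np.maximum(np.abs(C) - tau, 0.0)  # soft-thresholding
    return U @ C
```

Because `U` is fixed and shared across tensors, the second update is just two matrix products per iteration and is easy to parallelize, whereas each `svt` call requires a fresh decomposition.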

Abstract

The recently established Convolution Nuclear Norm Minimization (CNNM) addresses the problem of *tensor completion with arbitrary sampling* (TCAS), which involves restoring a tensor from a subset of its entries sampled in an arbitrary manner. Despite its promising performance, the optimization procedure of CNNM requires performing Singular Value Decomposition (SVD) multiple times, which is computationally expensive and hard to parallelize. To address this issue, we reformulate the optimization objective of CNNM from the perspective of convolution eigenvectors. By introducing pre-learned convolution eigenvectors that are shared among different tensors, we propose a novel method called Inductive Convolution Nuclear Norm Minimization (ICNNM), which bypasses the SVD step and thereby significantly reduces computational time. In addition, owing to the extra prior knowledge encoded in the pre-learned convolution eigenvectors, ICNNM also outperforms CNNM in terms of recovery performance. Extensive experiments on video completion, prediction, and frame interpolation verify the superiority of ICNNM over CNNM and several other competing methods.
