Inductive Convolution Nuclear Norm Minimization for Tensor Completion with Arbitrary Sampling
arXiv cs.CV / 4/21/2026
📰 News · Models & Research
Key Points
- The paper proposes ICNNM, a new approach to tensor completion under arbitrary sampling (TCAS), extending the earlier Convolution Nuclear Norm Minimization (CNNM) framework.
- CNNM’s optimization is computationally expensive because it requires multiple Singular Value Decompositions (SVDs), and the authors target this bottleneck.
- ICNNM reformulates the objective in terms of convolution eigenvectors; because these eigenvectors can be pre-learned and shared across tensors, the per-iteration SVD step is bypassed entirely (see the sketch after this list).
- The method not only reduces computational time substantially but also improves recovery performance by injecting additional prior knowledge through the pre-learned eigenvectors.
- Experiments on video completion, prediction, and frame interpolation show ICNNM outperforming CNNM and several other baselines.
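The intuition behind replacing SVDs with fixed eigenvectors can be sketched with a classical fact: circular convolution matrices are circulant, so they all share one fixed eigenbasis (the Fourier basis), and their singular values are the DFT magnitudes of the underlying signal. The toy Python snippet below is not the paper's actual algorithm; the 1-D setting, the signal `x`, and its length are illustrative assumptions. It only shows how a known, shared eigenbasis lets a convolution nuclear norm be evaluated without any SVD, which is the kind of shortcut ICNNM generalizes with pre-learned eigenvectors.

```python
import numpy as np
from scipy.linalg import circulant

# Illustrative 1-D signal (length and seed are arbitrary choices).
rng = np.random.default_rng(0)
x = rng.standard_normal(64)

# CNNM-style evaluation: materialize the circular convolution matrix
# of x and compute its nuclear norm via an explicit SVD.
C = circulant(x)  # n x n circulant matrix with x as first column
nuclear_norm_svd = np.linalg.svd(C, compute_uv=False).sum()

# Eigenvector-based evaluation: circulant matrices are diagonalized by
# the DFT, so their singular values are |fft(x)|. No SVD needed, just
# a projection onto the fixed eigenbasis shared by every circulant.
nuclear_norm_fft = np.abs(np.fft.fft(x)).sum()

print(nuclear_norm_svd, nuclear_norm_fft)  # agree up to numerical error
```

For circulant matrices the shared eigenbasis is known in closed form; ICNNM's contribution, per the summary above, is to learn such a shared basis in advance for the tensors at hand, which both removes the SVD bottleneck and injects prior knowledge.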