Asymptotic Optimism for Tensor Regression Models with Applications to Neural Network Compression
arXiv cs.LG / 3/30/2026
Key Points
- The paper analyzes rank selection for low-rank tensor regression under random covariate designs, focusing on the gap between training performance and test performance (the "optimism"); a simulation sketch of this gap follows the list.
- Under a Gaussian random-design model, it derives population-level expressions for the expected training-test discrepancy for both CP and Tucker tensor decompositions (the two formats are contrasted in the parameter-count sketch below).
- It shows that the expected optimism is minimized at the true tensor rank, motivating a prediction-oriented rank-selection rule that is consistent with cross-validation and supports tensor-model averaging.
- The authors characterize when and why under-ranked or over-ranked models can still appear preferable, delineating the method's limitations and practical scope.
- They validate the approach on real image-regression tasks and extend it to tensor-based compression of neural networks, positioning optimism-based rank selection as a model-selection tool for deep-learning compression (see the Tucker compression sketch below).
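
To make the optimism concrete, here is a minimal NumPy sketch: it fits a low-rank coefficient matrix (the order-2 tensor case, for brevity) by alternating least squares under a Gaussian random design and reports the empirical train-test gap at each candidate rank. The function names and the ALS routine are illustrative, not the paper's estimator; the paper derives the expected gap analytically rather than by simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_lowrank(X, y, rank, n_iter=60):
    """Fit y_i = <X_i, B> + noise with B = U @ V.T by alternating
    least squares (each half-step is an ordinary least-squares solve)."""
    n, p1, p2 = X.shape
    V = rng.standard_normal((p2, rank))
    for _ in range(n_iter):
        # With V fixed, y_i = vec(X_i @ V) . vec(U) is linear in U.
        Zu = np.einsum("nab,bk->nak", X, V).reshape(n, -1)
        U = np.linalg.lstsq(Zu, y, rcond=None)[0].reshape(p1, rank)
        # With U fixed, y_i = vec(X_i.T @ U) . vec(V) is linear in V.
        Zv = np.einsum("nab,ak->nbk", X, U).reshape(n, -1)
        V = np.linalg.lstsq(Zv, y, rcond=None)[0].reshape(p2, rank)
    return U @ V.T

def mse(X, y, B):
    return np.mean((y - np.einsum("nab,ab->n", X, B)) ** 2)

# Rank-2 ground truth, i.i.d. Gaussian matrix covariates.
n, p1, p2, sigma = 500, 8, 8, 0.5
B_star = rng.standard_normal((p1, 2)) @ rng.standard_normal((2, p2))
X_tr, X_te = rng.standard_normal((n, p1, p2)), rng.standard_normal((n, p1, p2))
y_tr = np.einsum("nab,ab->n", X_tr, B_star) + sigma * rng.standard_normal(n)
y_te = np.einsum("nab,ab->n", X_te, B_star) + sigma * rng.standard_normal(n)

for r in range(1, 6):
    B = fit_lowrank(X_tr, y_tr, r)
    tr, te = mse(X_tr, y_tr, B), mse(X_te, y_te, B)
    print(f"rank {r}: train {tr:.3f}  test {te:.3f}  optimism {te - tr:.3f}")
```

Under-ranked fits (rank 1 here) pay a bias penalty on both splits, while over-ranked fits lower the training error but widen the gap, which is the trade-off the rank-selection rule exploits.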

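For the CP-versus-Tucker comparison, the two formats differ mainly in how their parameters scale: CP with rank R uses one d_k x R factor matrix per mode, while Tucker adds a dense core of size r_1 x ... x r_K on top of per-mode factors. The counting below uses illustrative shapes of my choosing, not figures from the paper.

```python
import numpy as np

def cp_params(shape, rank):
    # One d_k x rank factor matrix per mode (scalar weights absorbed into a factor).
    return rank * sum(shape)

def tucker_params(shape, ranks):
    # A dense core of size prod(ranks) plus one d_k x r_k factor matrix per mode.
    return int(np.prod(ranks)) + sum(d * r for d, r in zip(shape, ranks))

shape = (64, 64, 3, 3)  # e.g. a conv kernel: (out_ch, in_ch, kh, kw)
print("dense  :", int(np.prod(shape)))
print("CP-8   :", cp_params(shape, 8))
print("Tucker :", tucker_params(shape, (16, 16, 3, 3)))
```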

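For the neural-network-compression angle, a standard route (which may differ from the paper's exact pipeline) is to replace a weight tensor with a truncated Tucker approximation. The sketch below computes one via truncated HOSVD on a hypothetical convolution kernel; the shapes, ranks, and helper names are assumptions for illustration.

```python
import numpy as np

def mode_unfold(T, mode):
    # Matricize T along the given mode: shape (d_mode, prod(other dims)).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    # Multiply tensor T by matrix M along the given mode.
    moved = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(moved, 0, mode)

def truncated_hosvd(T, ranks):
    """Tucker approximation via truncated HOSVD: SVD each unfolding,
    keep the top left singular vectors as factors, then project T."""
    factors = [np.linalg.svd(mode_unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_product(core, U.T, m)
    return core, factors

def reconstruct(core, factors):
    T = core
    for m, U in enumerate(factors):
        T = mode_product(T, U, m)
    return T

# Hypothetical conv kernel with (approximately) low multilinear rank.
rng = np.random.default_rng(0)
shape, ranks = (64, 64, 3, 3), (16, 16, 3, 3)
core0 = rng.standard_normal(ranks)
Us = [rng.standard_normal((d, r)) for d, r in zip(shape, ranks)]
W = reconstruct(core0, Us) + 0.01 * rng.standard_normal(shape)

core, factors = truncated_hosvd(W, ranks)
W_hat = reconstruct(core, factors)

orig = W.size
comp = core.size + sum(U.size for U in factors)
rel_err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"params: {orig} -> {comp} ({orig / comp:.1f}x), rel. error {rel_err:.4f}")
```

Choosing the Tucker ranks here is exactly the rank-selection problem the paper targets: too small and reconstruction (and downstream accuracy) degrades, too large and the compression gain evaporates.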