Evaluating Singular Value Thresholds for DNN Weight Matrices based on Random Matrix Theory

arXiv stat.ML / 4/10/2026


Key Points

  • The paper studies how to choose singular-value thresholds for low-rank approximations of DNN weight matrices using random matrix theory to separate signal from noise components.
  • It models each weight matrix as the sum of a signal matrix plus a noise matrix, then removes “noise-related” singular values to obtain the low-rank approximation.
  • To validate whether a threshold is appropriate, the authors introduce an evaluation metric based on cosine similarity between singular vectors of the inferred signal and the original weight matrix.
  • Numerical experiments use the proposed cosine-similarity metric to compare two threshold estimation methods and judge the adequacy of the resulting approximations.
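The thresholding idea behind the first two points can be sketched as follows. This is an illustrative assumption, not the paper's exact estimator: we use the classical random-matrix edge for an i.i.d. Gaussian noise matrix, `sigma * (sqrt(m) + sqrt(n))`, as the cutoff; the paper compares two (unspecified here) threshold estimation methods.

```python
import numpy as np

def rmt_threshold_denoise(W, sigma):
    """Low-rank approximation of a weight matrix W modeled as signal + noise.

    Keeps only singular values above a random-matrix-theory threshold.
    Here the cutoff is the approximate largest singular value of an
    m x n i.i.d. Gaussian noise matrix with entry std `sigma`,
    i.e. sigma * (sqrt(m) + sqrt(n)) -- an assumed stand-in for the
    paper's threshold estimators.
    """
    m, n = W.shape
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    tau = sigma * (np.sqrt(m) + np.sqrt(n))   # assumed RMT noise edge
    k = int(np.sum(s > tau))                  # rank of the inferred signal
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :], k

# Toy example: a rank-2 signal buried in Gaussian noise
rng = np.random.default_rng(0)
m, n, sigma = 200, 100, 0.1
signal = (5.0 * rng.standard_normal((m, 2))) @ rng.standard_normal((2, n))
W = signal + sigma * rng.standard_normal((m, n))
W_hat, k = rmt_threshold_denoise(W, sigma)
```

In this synthetic setting the two signal singular values sit far above the noise edge, so the estimated rank recovers the true rank 2 and `W_hat` closely matches the clean signal.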

Abstract

This study evaluates thresholds for removing singular values from singular value decomposition-based low-rank approximations of deep neural network weight matrices. Each weight matrix is modeled as the sum of signal and noise matrices. The low-rank approximation is obtained by removing noise-related singular values using a threshold based on random matrix theory. To assess the adequacy of this threshold, we propose an evaluation metric based on the cosine similarity between the singular vectors of the signal and original weight matrices. The proposed metric is used in numerical experiments to compare two threshold estimation methods.
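The evaluation metric described above can be sketched as below. The exact pairing and aggregation of singular vectors in the paper may differ; this sketch simply compares corresponding left and right singular vectors of the inferred signal and the original matrix, taking absolute values because singular vectors are only defined up to sign.

```python
import numpy as np

def singular_vector_cosine(W, W_hat):
    """Cosine similarity between corresponding singular vectors of the
    original matrix W and the inferred signal W_hat (illustrative
    version of the paper's proposed metric)."""
    U, _, Vt = np.linalg.svd(W, full_matrices=False)
    Uh, sh, Vth = np.linalg.svd(W_hat, full_matrices=False)
    k = int(np.sum(sh > 1e-10))  # effective rank of the signal estimate
    # |cos| per retained direction; abs absorbs the arbitrary sign flip
    cos_u = np.abs(np.sum(U[:, :k] * Uh[:, :k], axis=0))
    cos_v = np.abs(np.sum(Vt[:k, :] * Vth[:k, :], axis=1))
    return cos_u, cos_v

# Sanity check: a rank-5 truncation of W shares its top-5 singular vectors
rng = np.random.default_rng(1)
W = rng.standard_normal((50, 30))
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 5
W_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
cos_u, cos_v = singular_vector_cosine(W, W_hat)
```

When the low-rank approximation is exactly a truncated SVD, each retained cosine is 1; a threshold that cuts into genuine signal directions would instead show similarities well below 1.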
