Heavy-Tailed Principal Component Analysis

arXiv cs.LG / 3/13/2026

Key Points

  • The authors study PCA under heavy-tailed data using a logarithmic loss that remains well defined without finite moments.
  • They show that the heavy-tailed principal directions coincide with those obtained by standard PCA on the Gaussian generator's covariance, establishing a theoretical link.
  • They propose robust covariance estimators derived from heavy-tailed data and compare them to the empirical covariance and Tyler's scatter estimator.
  • Experimental results, including background denoising tasks, demonstrate that the method reliably recovers principal directions and outperforms classical PCA under heavy-tailed and impulsive noise, while remaining competitive under Gaussian noise.
  • The framework encompasses a wide class of heavy-tailed distributions, such as the multivariate t and sub-Gaussian α-stable laws, with practical implications for high-dimensional analysis (see the data-generation sketch after this list).
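
To make the generative model concrete: drawing the random scale as A = ν/χ²_ν turns the Gaussian generator into a multivariate t distribution with ν degrees of freedom, one of the heavy-tailed laws covered by the framework. The following minimal sketch draws such samples; the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def sample_superstatistical_t(n, cov, nu, seed=None):
    """Draw n samples X = sqrt(A) * G with G ~ N(0, cov) and A = nu / chi2_nu
    independent of G; the result follows a multivariate t_nu distribution."""
    rng = np.random.default_rng(seed)
    p = cov.shape[0]
    G = rng.multivariate_normal(np.zeros(p), cov, size=n)  # Gaussian generator
    A = nu / rng.chisquare(nu, size=n)                     # positive random scale
    return np.sqrt(A)[:, None] * G                         # heavy-tailed observations

# nu = 2 gives infinite variance, so the empirical covariance is not consistent.
X = sample_superstatistical_t(2000, np.diag([5.0, 2.0, 1.0, 0.5, 0.1]), nu=2.0, seed=0)
```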

Abstract

Principal Component Analysis (PCA) is a cornerstone of dimensionality reduction, yet its classical formulation relies critically on second-order moments and is therefore fragile in the presence of heavy-tailed data and impulsive noise. While numerous robust PCA variants have been proposed, most either assume finite variance, rely on sparsity-driven decompositions, or address robustness through surrogate loss functions without a unified treatment of infinite-variance models. In this paper, we study PCA for high-dimensional data generated according to a superstatistical dependent model of the form $\mathbf{X} = A^{1/2}\mathbf{G}$, where $A$ is a positive random scalar and $\mathbf{G}$ is a Gaussian vector. This framework captures a wide class of heavy-tailed distributions, including multivariate t and sub-Gaussian $\alpha$-stable laws. We formulate PCA under a logarithmic loss, which remains well defined even when moments do not exist. Our main theoretical result shows that, under this loss, the principal components of the heavy-tailed observations coincide with those obtained by applying standard PCA to the covariance matrix of the underlying Gaussian generator. Building on this insight, we propose robust estimators for this covariance matrix directly from heavy-tailed data and compare them with the empirical covariance and Tyler's scatter estimator. Extensive experiments, including background denoising tasks, demonstrate that the proposed approach reliably recovers principal directions and significantly outperforms classical PCA in the presence of heavy-tailed and impulsive noise, while remaining competitive under Gaussian noise.
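
The abstract benchmarks against the empirical covariance and Tyler's scatter estimator. The sketch below shows the standard fixed-point iteration for Tyler's M-estimator together with eigendecomposition-based extraction of principal directions; it illustrates those baselines only, not the authors' proposed estimators, and all names are illustrative.

```python
import numpy as np

def tyler_scatter(X, n_iter=200, tol=1e-9):
    """Tyler's M-estimator of scatter: the fixed-point iteration
    Sigma <- (p/n) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i),
    renormalized to trace(Sigma) = p to fix the scale ambiguity."""
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(Sigma)
        w = 1.0 / np.einsum('ij,jk,ik->i', X, inv, X)  # 1 / (x_i^T Sigma^{-1} x_i)
        Sigma_new = (p / n) * (X * w[:, None]).T @ X
        Sigma_new *= p / np.trace(Sigma_new)
        if np.linalg.norm(Sigma_new - Sigma, 'fro') < tol:
            return Sigma_new
        Sigma = Sigma_new
    return Sigma

def principal_directions(S, k):
    """Top-k eigenvectors of a symmetric scatter/covariance matrix."""
    vals, vecs = np.linalg.eigh(S)
    return vecs[:, np.argsort(vals)[::-1][:k]]

# Heavy-tailed data as in the model above: X = sqrt(A) * G with A = nu/chi2_nu.
rng = np.random.default_rng(0)
cov = np.diag([5.0, 2.0, 1.0, 0.5, 0.1])
G = rng.multivariate_normal(np.zeros(5), cov, size=2000)
X = G * np.sqrt(2.0 / rng.chisquare(2.0, size=2000))[:, None]

U_emp = principal_directions(np.cov(X, rowvar=False), k=2)
U_tyler = principal_directions(tyler_scatter(X), k=2)
```

A design note on why Tyler's estimator is a natural baseline here: the iteration depends on each observation only through its direction x/‖x‖, so the random scale $A$ in the superstatistical model drops out entirely, whereas the empirical covariance has no such protection against heavy tails.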