Learning Representations for Independence Testing

arXiv stat.ML / 3/23/2026


Key Points

  • The paper studies two related approaches to learning powerful independence tests: constructing finite-sample-valid tests from variational estimators of mutual information (such as InfoNCE and NWJ), and learning kernels for tests based on the Hilbert-Schmidt Independence Criterion (HSIC).
  • It establishes a close connection between variational mutual information-based tests and HSIC-based tests, showing that learning a variational bound for mutual information is closely related to learning a kernel for HSIC.
  • The authors contrast selecting a representation to maximize the test statistic itself, which they term a Neural Dependency Statistic (NDS), with selecting a representation that maximizes the power of the resulting test.
  • They address misconceptions in HSIC power optimization and extend to deep kernels; experiments show that optimized HSIC tests with exact level control generally outperform other approaches on challenging problems involving structured dependence.
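The exact level control mentioned above is typically obtained with a permutation test: shuffling one sample breaks any dependence, so the permuted statistics form a valid null distribution at any sample size. A minimal sketch (not the paper's implementation) of such an HSIC permutation test, assuming a fixed Gaussian kernel with a hand-picked bandwidth rather than a learned deep kernel:

```python
import numpy as np

def gaussian_kernel(a, bandwidth=1.0):
    # RBF kernel matrix from pairwise squared distances.
    sq = np.sum(a**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * a @ a.T
    return np.exp(-d2 / (2 * bandwidth**2))

def hsic(x, y, bandwidth=1.0):
    # Biased empirical HSIC: trace(K H L H) / n^2, with H the centering matrix.
    n = x.shape[0]
    K = gaussian_kernel(x, bandwidth)
    L = gaussian_kernel(y, bandwidth)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

def hsic_permutation_test(x, y, n_perms=200, seed=0):
    # Permuting y severs any dependence with x, so the permuted
    # statistics sample the null distribution; the +1 terms make the
    # p-value exactly valid at finite sample sizes.
    rng = np.random.default_rng(seed)
    stat = hsic(x, y)
    null = [hsic(x, y[rng.permutation(len(y))]) for _ in range(n_perms)]
    p_value = (1 + sum(s >= stat for s in null)) / (1 + n_perms)
    return stat, p_value
```

Learning the kernel (or its bandwidth) to maximize power, as the paper discusses, would happen on a held-out split before running this test, so that the permutation null remains valid.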

Abstract

Many tools exist to detect dependence between random variables, a core question across a wide range of machine learning, statistical, and scientific endeavors. Although several statistical tests guarantee eventual detection of any dependence with enough samples, standard tests may require an exorbitant amount of samples for detecting subtle dependencies between high-dimensional random variables with complex distributions. In this work, we study two related ways to learn powerful independence tests. First, we show how to construct powerful statistical tests with finite-sample validity by using variational estimators of mutual information, such as the InfoNCE or NWJ estimators. Second, we establish a close connection between these variational mutual information-based tests and tests based on the Hilbert-Schmidt Independence Criterion (HSIC); in particular, learning a variational bound (typically parameterized by a deep network) for mutual information is closely related to learning a kernel for HSIC. Finally, we show how to, rather than selecting a representation to maximize the statistic itself, select a representation which can maximize the power of a test, in either setting; we term the former case a Neural Dependency Statistic (NDS). While HSIC power optimization has been recently considered in the literature, we correct some important misconceptions and expand to considering deep kernels. In our experiments, while all approaches can yield powerful tests with exact level control, optimized HSIC tests generally outperform the other approaches on difficult problems of detecting structured dependence.
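To make the variational side of the abstract concrete, the InfoNCE bound scores each paired sample against in-batch negatives via a critic function. A minimal sketch, assuming a fixed dot-product critic rather than the learned deep-network critic the paper considers:

```python
import numpy as np

def _logsumexp_rows(s):
    # Numerically stable log-sum-exp over each row.
    m = s.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(s - m).sum(axis=1, keepdims=True))).squeeze(1)

def infonce_statistic(x, y):
    # scores[i, j] = critic(x_i, y_j); here the critic is a plain dot
    # product (an illustrative assumption, not the paper's choice).
    scores = x @ y.T
    n = scores.shape[0]
    # InfoNCE lower bound on mutual information:
    # mean_i [ log( e^{s_ii} / (1/n) sum_j e^{s_ij} ) ], capped at log n.
    return np.mean(np.diag(scores) - _logsumexp_rows(scores)) + np.log(n)
```

The statistic is at most log n, which is why the paper connects it to HSIC with a learned kernel rather than relying on the bound alone; in either case, a permutation test over one sample turns the statistic into a finite-sample-valid test.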
