Revisiting Neural Activation Coverage for Uncertainty Estimation

arXiv cs.LG / 4/27/2026


Key Points

  • The paper revisits Neural Activation Coverage (NAC), a method proposed for out-of-distribution detection and generalization, and repurposes it for uncertainty estimation.
  • It extends NAC so it can estimate uncertainty on regression tasks for already-trained neural networks, rather than requiring retraining or a specialized setup.
  • The authors' experiments show that NAC-derived uncertainty scores are more meaningful than alternative approaches such as Monte Carlo Dropout.
  • Overall, the work positions NAC as a potentially stronger uncertainty metric for regression with existing, already-trained neural networks.
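The paper's details are not reproduced here, but the core intuition behind activation-coverage-based uncertainty can be illustrated with a toy sketch: record the per-neuron activation ranges seen on training data, then score a new input by how many of its activations fall outside those ranges. Everything below (the random stand-in network, the range-based score) is a hypothetical simplification, not the NAC formulation from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed random one-hidden-layer network stands in for a trained model.
W = rng.normal(size=(2, 64))

def hidden(x):
    """ReLU activations of the hidden layer for inputs x of shape (n, 2)."""
    return np.maximum(np.atleast_2d(x) @ W, 0.0)

# "Training" data clustered near the origin defines per-neuron ranges.
train = rng.normal(size=(500, 2))
acts = hidden(train)
lo, hi = acts.min(axis=0), acts.max(axis=0)

def uncertainty(x):
    """Fraction of neurons whose activation leaves the observed range."""
    a = hidden(x)
    outside = (a < lo) | (a > hi)
    return outside.mean(axis=1)

in_dist = uncertainty(np.array([0.1, -0.2]))[0]   # near training data
far_ood = uncertainty(np.array([8.0, 8.0]))[0]    # far from training data
```

Inputs far from the training distribution drive more neurons outside their observed activation ranges, so they receive a higher score; the appeal noted in the key points is that this requires only forward passes through an already-trained network.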

Abstract

Neural activation coverage (NAC) is a recently proposed technique for out-of-distribution detection and generalization. We build upon this promising foundation and extend the method to work as an uncertainty estimation technique for already-trained artificial neural networks in the domain of regression. Our experiments confirm NAC uncertainty scores to be more meaningful than other techniques, e.g., Monte Carlo Dropout.
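For context on the baseline the abstract compares against, Monte Carlo Dropout estimates uncertainty by keeping dropout active at test time and using the spread of several stochastic forward passes as the score. The sketch below is a minimal, hypothetical illustration with random stand-in weights, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random weights stand in for a trained regression network.
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x, p=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)
    mask = rng.random(h.shape) > p   # drop each unit with probability p
    h = h * mask / (1.0 - p)         # inverted-dropout scaling
    return h @ W2

def mc_dropout_uncertainty(x, T=200):
    """Std. dev. across T stochastic passes as a per-input uncertainty."""
    preds = np.stack([forward(x) for _ in range(T)])
    return preds.std(axis=0)

sigma = mc_dropout_uncertainty(np.array([[0.5]]))[0, 0]
```

Note the contrast motivating the paper: MC Dropout needs many forward passes per input (and a network containing dropout layers), whereas a coverage-based score can in principle be computed from a single pass through any already-trained model.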