Private Minimum Hellinger Distance Estimation via Hellinger Distance Differential Privacy

arXiv stat.ML / 4/22/2026

Opinion · Models & Research

Key Points

  • The paper proposes “private minimum Hellinger distance” estimators that satisfy a new privacy notion, Hellinger differential privacy.
  • It shows that these estimators maintain the robustness and efficiency benefits typically associated with Hellinger-distance-based objective functions.
  • The authors argue Hellinger differential privacy is closely related to standard differential privacy while enabling sharper inference results.
  • For scalable computation, the work introduces Hellinger-differentially-private versions of gradient descent and Newton-Raphson.
  • Numerical experiments in finite samples confirm robustness under gross-error contamination and illustrate the estimators’ behavior in practice.
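The paper's Hellinger-differentially-private gradient descent is not spelled out in this summary. As a rough, generic sketch, noise-perturbed gradient descent has the shape below; note that calibrating the noise to Hellinger differential privacy is the paper's contribution and is not reproduced here, so `noise_scale` and the other parameters are placeholders, not the paper's calibration.

```python
import numpy as np

def noisy_gradient_descent(grad_fn, theta0, steps=100, lr=0.1,
                           noise_scale=0.01, seed=0):
    """Generic noise-perturbed gradient descent (illustrative only).

    grad_fn     : callable returning the gradient at theta
    noise_scale : placeholder noise level; a private algorithm would
                  calibrate this to its privacy notion.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        g = grad_fn(theta)
        # Perturb each gradient step with Gaussian noise before updating.
        theta = theta - lr * (g + noise_scale * rng.standard_normal(theta.shape))
    return theta
```

On a smooth objective the iterates settle near the minimizer, with residual jitter on the order of `lr * noise_scale`; a Newton-Raphson variant would perturb the Newton step instead of the raw gradient.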

Abstract

Objective functions based on Hellinger distance yield robust and efficient estimators of model parameters. Motivated by privacy and regulatory requirements encountered in contemporary applications, we derive in this paper *private minimum Hellinger distance estimators*. The estimators satisfy a new privacy constraint, namely, Hellinger differential privacy, while retaining the robustness and efficiency properties. We demonstrate that Hellinger differential privacy shares several features of standard differential privacy while allowing for sharper inference. Additionally, for computational purposes, we also develop Hellinger differentially private gradient descent and Newton-Raphson algorithms. We illustrate the behavior of our estimators in finite samples using numerical experiments and verify that they retain robustness properties under gross-error contamination.
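To make the robustness claim concrete, here is a minimal *non-private* sketch of minimum Hellinger distance estimation for a normal mean. The kernel bandwidth, grids, and one-parameter normal model are illustrative assumptions, and the paper's private estimator adds a privacy mechanism on top of this baseline, which is not reproduced here.

```python
import numpy as np

def mhd_estimate_mean(data, sigma=1.0, bw=0.5):
    """Minimum Hellinger distance estimate of a normal mean (known sigma).

    Minimizes the squared Hellinger distance
        H^2(theta) = 2 * (1 - integral sqrt(f_theta * f_n))
    between the model density f_theta and a Gaussian kernel density
    estimate f_n of the data, via grid search over theta.
    """
    grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 400)
    dx = grid[1] - grid[0]
    # Gaussian kernel density estimate of the data on the grid.
    fn = np.mean(
        np.exp(-0.5 * ((grid[:, None] - data[None, :]) / bw) ** 2), axis=1
    ) / (bw * np.sqrt(2.0 * np.pi))

    def hellinger_sq(theta):
        f_theta = np.exp(-0.5 * ((grid - theta) / sigma) ** 2) / (
            sigma * np.sqrt(2.0 * np.pi)
        )
        affinity = np.sum(np.sqrt(f_theta * fn)) * dx
        return 2.0 * (1.0 - affinity)

    thetas = np.linspace(data.min(), data.max(), 200)
    return thetas[np.argmin([hellinger_sq(t) for t in thetas])]
```

Under gross-error contamination (say, a few points far from the bulk of the sample), the sample mean is pulled toward the outliers while the Hellinger affinity essentially ignores them, since the model density is negligible there; this is the robustness property the paper's private estimators are shown to retain.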