Edgeworth Accountant: An Analytical Approach to Differential Privacy Composition

arXiv stat.ML / 4/8/2026


Key Points

  • The paper proposes the Edgeworth Accountant, an analytical method for computing overall privacy loss when multiple differential-privacy mechanisms are composed.
  • It uses the $f$-differential privacy framework and privacy-loss log-likelihood ratios (PLLRs) to derive closed-form, non-asymptotic $(\epsilon, \delta)$ privacy guarantees.
  • By applying Edgeworth expansion to estimate the distribution of the summed PLLRs, the approach provides accurate privacy accounting, including tight upper and lower bounds.
  • The authors show the method can work for essentially any noise-addition mechanism by simplifying complex distributions, aiming to avoid the computational blow-up seen in some prior composition techniques.
  • The paper highlights potential value for training private models in deep learning and for federated analytics, where many DP building blocks are typically composed.
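To make the key points above concrete, the standard $f$-DP/PLLR duality expresses the $(\epsilon, \delta)$ guarantee through the tails of the summed PLLRs, and the Edgeworth expansion refines the Gaussian (CLT) approximation of those tails. The notation below is a hedged reconstruction, not copied from the paper: $X_i$ and $\tilde X_i$ denote the per-step PLLRs under the two neighboring-dataset hypotheses, and $\gamma$ is the skewness of a single PLLR.

```latex
% (epsilon, delta) from the tails of the summed PLLRs
\delta(\epsilon) \;=\; \Pr\!\Big[\textstyle\sum_{i=1}^{n} X_i > \epsilon\Big]
  \;-\; e^{\epsilon}\,\Pr\!\Big[\textstyle\sum_{i=1}^{n} \tilde X_i > \epsilon\Big]

% first-order Edgeworth approximation of the standardized sum's CDF,
% correcting the CLT by a skewness term of order 1/sqrt(n)
F_n(x) \;\approx\; \Phi(x) \;-\; \frac{\gamma}{6\sqrt{n}}\,\big(x^2 - 1\big)\,\phi(x),
\qquad \gamma = \frac{\kappa_3}{\sigma^3}
```

The correction term vanishes when the PLLRs are symmetric ($\kappa_3 = 0$), in which case the expansion collapses back to the plain normal approximation; higher-order expansions add further cumulant terms.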

Abstract

In privacy-preserving data analysis, many procedures and algorithms are structured as compositions of multiple private building blocks. An important question is therefore how to efficiently compute the overall privacy loss under composition. This paper introduces the Edgeworth Accountant, an analytical approach to composing differential privacy guarantees for private algorithms. Leveraging the $f$-differential privacy framework, the Edgeworth Accountant accurately tracks privacy loss under composition, yielding a closed-form expression of privacy guarantees through privacy-loss log-likelihood ratios (PLLRs). As its name suggests, the method applies the Edgeworth expansion to approximate the probability distribution of the sum of the PLLRs. Furthermore, via a reduction that approximates complex mechanism distributions by simpler ones, the Edgeworth Accountant applies to essentially any noise-addition mechanism. Its main advantage is providing $(\epsilon, \delta)$-differential privacy bounds that are non-asymptotic and computationally cheap, setting it apart from previous approaches whose running time grows with the number of mechanisms under composition. We conclude by showing how the Edgeworth Accountant offers accurate estimates and tight upper and lower bounds on $(\epsilon, \delta)$-differential privacy guarantees, especially tailored for training private models in deep learning and for federated analytics.