FairTree: Subgroup Fairness Auditing of Machine Learning Models with Bias-Variance Decomposition
arXiv cs.LG / 4/22/2026
Key Points
- The paper argues that standard loss-based evaluation can miss performance shifts affecting specific subgroups, motivating better fairness auditing methods.
- It introduces FairTree, an algorithm for subgroup fairness auditing that can directly handle continuous, categorical, and ordinal features without discretization.
- FairTree extends prior auditing ideas by decomposing performance disparities into systematic bias and variance, enabling a clearer interpretation of why subgroup performance changes.
- The authors propose two variants—a permutation-based method and a fluctuation test—and simulation results show both maintain acceptable false-positive rates, with the fluctuation approach achieving higher power (an illustrative sketch of the permutation-testing idea follows this list).
- They demonstrate the approach on the UCI Adult Census dataset, suggesting the framework can support statistical evaluation of fairness even with relatively small datasets.
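The paper specifies FairTree's permutation and fluctuation variants in detail; the sketch below is not the authors' procedure, but a minimal, hypothetical illustration of the general permutation-testing idea behind subgroup auditing: shuffle subgroup membership to build a null distribution for the loss disparity and compare it against the observed disparity. All function and variable names here are assumptions for illustration only.

```python
# Illustrative permutation test for a subgroup loss disparity.
# NOT the FairTree algorithm from the paper; a generic sketch of the idea
# of testing whether a subgroup's mean loss differs from the rest.
import numpy as np


def permutation_subgroup_audit(losses, in_subgroup, n_permutations=5000, seed=0):
    """Test whether the mean loss inside a subgroup differs from outside it.

    losses       : per-example losses of the audited model (1-D array)
    in_subgroup  : boolean mask selecting the subgroup (1-D array)
    Returns the observed disparity and a two-sided permutation p-value.
    """
    rng = np.random.default_rng(seed)
    losses = np.asarray(losses, dtype=float)
    in_subgroup = np.asarray(in_subgroup, dtype=bool)

    observed = losses[in_subgroup].mean() - losses[~in_subgroup].mean()

    count = 0
    for _ in range(n_permutations):
        perm = rng.permutation(in_subgroup)          # break the loss/group link
        disparity = losses[perm].mean() - losses[~perm].mean()
        if abs(disparity) >= abs(observed):
            count += 1
    p_value = (count + 1) / (n_permutations + 1)     # add-one correction
    return observed, p_value


if __name__ == "__main__":
    # Synthetic example: a hypothetical subgroup with mildly inflated losses.
    rng = np.random.default_rng(1)
    losses = rng.exponential(1.0, size=1000)
    in_subgroup = rng.random(1000) < 0.2             # ~20% of examples
    losses[in_subgroup] *= 1.3                       # inject a mild disparity
    disp, p = permutation_subgroup_audit(losses, in_subgroup)
    print(f"observed disparity = {disp:.3f}, permutation p-value = {p:.3f}")
```

This only tests whether a disparity exists; FairTree's contribution, per the summary above, is additionally decomposing such disparities into systematic bias and variance components and handling mixed feature types without discretization.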