Federated Active Learning Under Extreme Non-IID and Global Class Imbalance
arXiv cs.LG / 3/12/2026
Key Points
- The study shows that in federated active learning with extreme non-IID data and global class imbalance, choosing query models that promote more class-balanced sampling—especially for minority classes—yields better final performance.
- Global-model querying is beneficial only when the global distribution is highly imbalanced and client data are relatively homogeneous; otherwise, local models are preferable.
- The authors propose FairFAL, an adaptive class-fair FAL framework that decides between global and local queries based on a lightweight prediction-discrepancy measure.
- FairFAL also uses prototype-guided pseudo-labeling with global features to promote class-aware querying.
- It employs two-stage uncertainty-diversity balanced sampling with k-center refinement.
- Experiments on five benchmarks show consistent improvements over state-of-the-art methods in long-tailed, non-IID settings; the code is available on GitHub.
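The global-versus-local query decision described above could, under one plausible reading of the summary, be driven by how much the two models disagree on a shared unlabeled pool. A minimal sketch (the total-variation measure and `threshold` value are illustrative assumptions, not FairFAL's published rule):

```python
import numpy as np

def choose_query_model(global_probs, local_probs, threshold=0.3):
    """Pick the query model from prediction discrepancy (hypothetical
    reading of FairFAL's adaptive rule; threshold is an assumption).

    global_probs, local_probs: (n_samples, n_classes) softmax outputs
    of the global and local models on the same unlabeled pool.
    """
    # Mean total-variation distance between the two predictive distributions.
    discrepancy = 0.5 * np.abs(global_probs - local_probs).sum(axis=1).mean()
    # Low discrepancy suggests relatively homogeneous clients, the regime
    # where the summary says global-model querying pays off.
    return "global" if discrepancy < threshold else "local"
```

When the models agree closely, the rule falls back to the global model, matching the paper's observation that global querying helps only when client data are relatively homogeneous.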
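Prototype-guided pseudo-labeling with global features can be sketched as nearest-prototype assignment; FairFAL's exact prototype construction and any confidence filtering are not specified in the summary, so this is only an assumed baseline form:

```python
import numpy as np

def prototype_pseudo_labels(features, prototypes):
    """Assign each unlabeled sample the class of its nearest global
    prototype (a minimal sketch under assumed Euclidean geometry).

    features:   (n_samples, d) feature vectors from the global model
    prototypes: (n_classes, d) per-class mean feature vectors
    """
    # Pairwise distances from every sample to every class prototype.
    dists = np.linalg.norm(features[:, None, :] - prototypes[None, :, :], axis=2)
    # Pseudo-label = index of the closest prototype.
    return dists.argmin(axis=1)
```

Class-aware querying can then weight or filter candidates by these pseudo-labels, e.g. to oversample suspected minority classes.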
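The two-stage uncertainty-diversity sampling with k-center refinement might look like the following sketch: shortlist by predictive entropy, then greedily pick a diverse subset via k-center selection. The `candidate_factor` and the use of entropy as the uncertainty score are illustrative assumptions:

```python
import numpy as np

def two_stage_query(features, probs, budget, candidate_factor=5):
    """Two-stage query selection (assumed form): entropy shortlist,
    then greedy k-center refinement for diversity."""
    # Stage 1: shortlist the most uncertain samples by predictive entropy.
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    n_cand = min(len(features), budget * candidate_factor)
    candidates = np.argsort(entropy)[-n_cand:]

    # Stage 2: greedy k-center over the shortlist.
    cand_feats = features[candidates]
    chosen = [int(np.argmax(entropy[candidates]))]  # seed: most uncertain
    min_dist = np.linalg.norm(cand_feats - cand_feats[chosen[0]], axis=1)
    while len(chosen) < min(budget, n_cand):
        nxt = int(np.argmax(min_dist))  # farthest point from the chosen set
        chosen.append(nxt)
        min_dist = np.minimum(
            min_dist, np.linalg.norm(cand_feats - cand_feats[nxt], axis=1))
    return candidates[chosen]
```

The k-center step spreads the query budget across feature space, which is one way such a scheme could avoid repeatedly querying near-duplicate uncertain samples from majority classes.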