FedPBS: Proximal-Balanced Scaling Federated Learning Model for Robust Personalized Training for Non-IID Data
arXiv cs.LG / 3/17/2026
Key Points
- FedPBS is a new federated learning algorithm that combines ideas from FedBS and FedProx to address statistical heterogeneity and uneven client participation in highly non-IID settings.
- It dynamically adapts batch sizes to each client's resources to enable balanced and scalable participation, and applies a proximal correction for small-batch clients to stabilize local updates and reduce divergence from the global model.
- In experiments on CIFAR-10 and UCI-HAR under severe non-IID settings, FedPBS consistently outperforms state-of-the-art baselines such as FedBS, FedGA, MOON, and FedProx, with smooth loss curves indicating stable convergence.
- The results suggest that FedPBS delivers consistent performance gains across diverse federated environments, highlighting its potential for robust personalized training in real-world deployments.
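The two mechanisms in the key points, resource-aware batch sizes and a proximal correction that is stronger for small-batch clients, can be sketched together in a single local update. The sketch below is illustrative only: the `mu * (ref_batch / batch_size)` scaling rule, the function names, and all hyperparameters are assumptions, not taken from the paper; only the FedProx-style proximal term `mu * (w - w_global)` follows the standard published formulation.

```python
import numpy as np

def proximal_local_update(w_global, grad_fn, batch_size, ref_batch=64,
                          mu=0.1, lr=0.05, steps=20):
    """One client's local round, sketching the FedPBS-style idea:
    the proximal coefficient grows as the client's batch size shrinks
    (hypothetical scaling rule), so noisier small-batch updates are
    pulled more strongly toward the global model, as in FedProx."""
    # Hypothetical rule: smaller batch -> stronger proximal pull.
    mu_k = mu * (ref_batch / batch_size)
    w = w_global.copy()
    for _ in range(steps):
        # FedProx-style step: local gradient plus mu_k * (w - w_global).
        g = grad_fn(w, batch_size) + mu_k * (w - w_global)
        w -= lr * g
    return w

# Toy local objective: quadratic loss with gradient noise ~ 1/sqrt(batch),
# mimicking how small batches yield noisier local gradients.
rng = np.random.default_rng(0)
target = np.array([1.0, -2.0])

def grad_fn(w, batch_size):
    noise = rng.normal(scale=1.0 / np.sqrt(batch_size), size=w.shape)
    return (w - target) + noise

w0 = np.zeros(2)
w_small = proximal_local_update(w0, grad_fn, batch_size=8)   # strong pull
w_large = proximal_local_update(w0, grad_fn, batch_size=64)  # weaker pull
```

Running this, the small-batch client's update ends closer to the global model `w0` than the large-batch client's, which is the divergence-control behavior the summary attributes to FedPBS.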