Federated Learning with Hypergradient-based Online Update of Aggregation Weights

arXiv cs.LG / 5/4/2026


Key Points

  • The paper introduces FedHAW, a federated learning method designed for mobile and IoT settings where client data heterogeneity and changing communication conditions are common.
  • FedHAW performs online updates of the aggregation weights by computing hypergradients (i.e., gradients of the loss with respect to the aggregation weights) with low computational overhead.
  • Simulation experiments indicate that FedHAW achieves strong generalization performance in heterogeneous client environments.
  • The approach also demonstrates robustness to communication errors, suggesting it can tolerate unreliable or noisy links during federated training.

Abstract

Federated learning using mobile and Internet of Things devices requires not only the ability to handle heterogeneity in clients' data distributions but also high adaptability to varying communication environments. We propose FedHAW (Federated Learning with Hypergradient-based update of Aggregation Weights), which performs online updates of the aggregation weights. FedHAW updates the aggregation weights using the hypergradient, i.e., the gradient of the objective function with respect to the weights, which can be computed with low computational overhead. Simulation results show that the proposed method achieves high generalization performance in heterogeneous environments and high robustness to communication errors.
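To make the core idea concrete, here is a minimal toy sketch of hypergradient-based aggregation-weight updates. This is an illustration under simplifying assumptions, not the paper's algorithm: the client models, validation set, softmax parameterization of the weights, and step size below are all hypothetical choices. Since the aggregated model is a weighted sum of client models, the chain rule gives the hypergradient of a server-side loss with respect to each weight as a simple inner product, which is what makes the update cheap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical, for illustration only): K clients each hold a
# local linear-regression model theta_k of dimension d. Clients differ in
# how noisy their local models are, mimicking data heterogeneity.
K, d = 3, 5
true_theta = rng.normal(size=d)
client_thetas = [true_theta + rng.normal(scale=s, size=d)
                 for s in (0.1, 0.5, 2.0)]

# Server-side validation data used to evaluate the aggregated model.
X_val = rng.normal(size=(50, d))
y_val = X_val @ true_theta

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.zeros(K)  # softmax keeps aggregation weights on the simplex
eta = 0.2             # hypergradient step size (assumed)

for _ in range(100):
    w = softmax(logits)
    # Aggregation: theta = sum_k w_k * theta_k
    theta = sum(w_k * t_k for w_k, t_k in zip(w, client_thetas))
    # Validation MSE and its gradient with respect to theta.
    resid = X_val @ theta - y_val
    g_theta = 2 * X_val.T @ resid / len(y_val)
    # Chain rule: dL/dw_k = g_theta . theta_k, since theta is linear in w.
    g_w = np.array([g_theta @ t_k for t_k in client_thetas])
    # Backpropagate through the softmax to the logits.
    g_logits = w * (g_w - w @ g_w)
    logits -= eta * g_logits

w = softmax(logits)
```

Running this, the online update should shift weight away from the noisiest client (scale 2.0) toward the most accurate one, which is the qualitative behavior the paper targets in heterogeneous environments; the per-round cost is one gradient evaluation plus K inner products.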