FedSIR: Spectral Client Identification and Relabeling for Federated Learning with Noisy Labels

arXiv cs.LG / 4/23/2026

📰 News · Models & Research

Key Points

  • FedSIR introduces a multi-stage federated learning framework designed to handle noisy labels distributed across clients.
  • Instead of focusing on noise-tolerant loss functions or training dynamics, FedSIR analyzes the spectral structure of client feature representations to identify which clients hold clean labels and which hold noisy ones.
  • Using clean clients as spectral references, the method relabels samples on noisy clients by combining dominant class directions with residual subspaces.
  • FedSIR further stabilizes federated optimization with a noise-aware training strategy that combines logit-adjusted loss, knowledge distillation, and distance-aware aggregation.
  • Experiments on standard federated learning benchmarks show FedSIR outperforms existing state-of-the-art approaches for learning with noisy labels, and the authors provide code on GitHub.
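The spectral identification step above can be sketched roughly as follows. The paper's exact procedure is not given here, so the subspace dimension `k`, the use of principal angles as the consistency measure, and the function names are all illustrative assumptions:

```python
import numpy as np

def class_subspace(features, k=3):
    """Top-k spectral basis of one class's features on one client.

    features: (n_samples, d) array; returns a (d, k) orthonormal basis
    spanning the dominant directions of the centered class features.
    """
    X = features - features.mean(axis=0, keepdims=True)
    U, _, _ = np.linalg.svd(X.T, full_matrices=False)
    return U[:, :k]

def spectral_consistency(U_client, U_ref):
    """Agreement between two class subspaces, in [0, 1].

    The singular values of U_ref^T U_client are the cosines of the
    principal angles between the subspaces; we average their squares,
    so identical subspaces score 1.0 and orthogonal ones score 0.0.
    """
    s = np.linalg.svd(U_ref.T @ U_client, compute_uv=False)
    return float(np.mean(s ** 2))
```

Under this sketch, a client whose class-wise subspaces score consistently low against the aggregate references would be flagged as noisy; the server only needs the small `(d, k)` bases per class, which matches the paper's claim of minimal communication overhead.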

Abstract

Federated learning (FL) enables collaborative model training without sharing raw data; however, the presence of noisy labels across distributed clients can severely degrade learning performance. In this paper, we propose FedSIR, a multi-stage framework for robust FL under noisy labels. Unlike existing approaches that mainly rely on designing noise-tolerant loss functions or exploiting loss dynamics during training, our method leverages the spectral structure of client feature representations to identify and mitigate label noise. Our framework consists of three key components. First, we identify clean and noisy clients by analyzing the spectral consistency of class-wise feature subspaces, incurring minimal communication overhead. Second, clean clients provide spectral references that enable noisy clients to relabel potentially corrupted samples using both dominant class directions and residual subspaces. Third, we employ a noise-aware training strategy that integrates logit-adjusted loss, knowledge distillation, and distance-aware aggregation to further stabilize federated optimization. Extensive experiments on standard FL benchmarks demonstrate that FedSIR consistently outperforms state-of-the-art methods for FL with noisy labels. The code is available at https://github.com/sinagh72/FedSIR.
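The relabeling component can be illustrated with a minimal sketch. The abstract says corrupted samples are relabeled using both dominant class directions and residual subspaces; how FedSIR actually combines the two is not specified here, so the score below (projected energy minus residual energy against each class's reference basis) is a hypothetical choice, as are the function and argument names:

```python
import numpy as np

def relabel(feature, class_bases):
    """Assign the class whose reference subspace best explains a feature.

    feature: (d,) feature vector from a flagged noisy client.
    class_bases: list of (d, k) orthonormal bases, one per class,
                 built from clean-client spectral references.
    Score per class: energy captured by projecting onto the class
    subspace, minus the energy left in the residual (off-subspace) part.
    """
    scores = []
    for U in class_bases:
        proj = U @ (U.T @ feature)     # component inside the class subspace
        resid = feature - proj         # component in the residual subspace
        scores.append(proj @ proj - resid @ resid)
    return int(np.argmax(scores))
```

In practice one would relabel only when the best score clears a margin over the runner-up, leaving ambiguous samples untouched; the paper's actual decision rule may differ.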