Batch Normalization for Neural Networks on Complex Domains

arXiv cs.LG / 5/4/2026


Key Points

  • The paper builds on Riemannian neural networks by developing Riemannian batch normalization (BN) layers tailored to complex domains.
  • The proposed BN layer variants are closely related to prior Riemannian BN approaches, but extend the theory and the practical components to domains that earlier work has studied less.
  • A key contribution is deriving implementation-ready elements for specific complex geometries, such as the Siegel disk domain.
  • Experiments across multiple tasks—radar clutter classification, node classification, and action recognition—show improved training stability and accuracy benefits from the proposed approach.

Abstract

Riemannian neural networks have proven effective in solving a variety of machine learning tasks. The key to their success lies in the development of principled Riemannian analogs of fundamental building blocks in deep neural networks (DNNs). Among those, Riemannian batch normalization (BN) layers have been shown to enhance training stability and improve accuracy. In this paper, we propose BN layers for neural networks on complex domains. The proposed layers have close connections with existing Riemannian BN layers. We derive essential components for practical implementations of BN layers on some complex domains that are less studied in previous works, e.g., the Siegel disk domain. We conduct experiments on radar clutter classification, node classification, and action recognition, demonstrating the efficacy of our method.
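To make the abstract's idea concrete, here is a minimal NumPy sketch of the *generic* Riemannian BN recipe (center a batch at its Fréchet mean, rescale in the tangent space at the origin, then translate to a learnable bias) on the Poincaré unit disk, a simple complex domain related to the 1-dimensional Siegel disk. This is an illustrative assumption, not the paper's actual implementation: the Möbius operations and the Karcher-flow mean below are standard textbook formulas for the unit disk, and the function names (`mobius_add`, `frechet_mean`, `riemannian_bn`) are made up for this example.

```python
import numpy as np

def mobius_add(a, z):
    # Möbius addition on the Poincaré unit disk (curvature -1):
    # a ⊕ z = (a + z) / (1 + conj(a) z); (-a) ⊕ a = 0 gives centering.
    return (a + z) / (1 + np.conj(a) * z)

def log0(z):
    # Logarithmic map at the origin of the disk.
    r = np.abs(z)
    return np.arctanh(np.clip(r, 0.0, 1 - 1e-9)) * z / np.maximum(r, 1e-15)

def exp0(v):
    # Exponential map at the origin of the disk.
    r = np.abs(v)
    return np.tanh(r) * v / np.maximum(r, 1e-15)

def frechet_mean(x, iters=20):
    # Karcher-flow fixed-point iteration for the Fréchet mean:
    # repeatedly step from the current estimate along the mean tangent vector.
    mu = 0j
    for _ in range(iters):
        tangent = log0(mobius_add(-mu, x)).mean()
        mu = mobius_add(mu, exp0(tangent))
    return mu

def riemannian_bn(x, gamma=1.0, beta=0j, eps=1e-6):
    # Riemannian BN sketch: center the batch at its Fréchet mean,
    # normalize the dispersion in the tangent space at the origin,
    # then Möbius-translate to the learnable bias beta.
    mu = frechet_mean(x)
    v = log0(mobius_add(-mu, x))
    v = gamma * v / np.sqrt((np.abs(v) ** 2).mean() + eps)
    return mobius_add(beta, exp0(v))
```

The design choice mirrors Euclidean BN: the statistics (mean, dispersion) are computed intrinsically on the manifold, while the affine part (`gamma`, `beta`) acts through the tangent space and a Möbius translation so the output stays inside the disk.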