Any-Subgroup Equivariant Networks via Symmetry Breaking

arXiv cs.LG · March 23, 2026


Key Points

  • The paper introduces the Any-Subgroup Equivariant Network (ASEN), a single model that can be simultaneously equivariant to multiple permutation subgroups by modulating an auxiliary input feature.
  • They start from a fully permutation-equivariant base model and obtain subgroup equivariance via symmetry-breaking inputs whose automorphism group matches the target subgroup; the catch is that finding an input with a desired automorphism group is computationally hard.
  • To make the approach practical, they relax exact symmetry breaking to approximate symmetry breaking using the 2-closure concept to derive fast, scalable algorithms.
  • Theoretically, they show that subgroup-equivariant networks can simulate equivariant MLPs and achieve universality if the base model is universal, with empirical validation on graph and image tasks as well as multitask and transfer learning.
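To make the core mechanism in the key points concrete, here is a minimal NumPy sketch (an illustration of the general idea, not the paper's architecture): a DeepSets-style layer that is equivariant to all permutations of its inputs jointly, and that becomes equivariant only to a chosen subgroup once the auxiliary input `c` is held fixed. Equivariance then survives exactly for permutations in the automorphism (stabilizer) group of `c`.

```python
import numpy as np

def equivariant_layer(x, c):
    # Pointwise combination of x and the auxiliary input c, plus a
    # global mean: permutation-equivariant in (x, c) jointly.
    return x + c + x.mean()

x = np.array([3.0, 1.0, 2.0])
c = np.array([0.0, 0.0, 1.0])  # symmetry-breaking input; Aut(c) swaps only coords 0 and 1

swap01 = np.array([1, 0, 2])   # permutation in Aut(c): c[swap01] == c
swap02 = np.array([2, 1, 0])   # permutation outside Aut(c)

# For P in Aut(c): layer(P x, c) == P layer(x, c)  -- equivariance retained
assert np.allclose(equivariant_layer(x[swap01], c),
                   equivariant_layer(x, c)[swap01])

# For P outside Aut(c): the fixed c breaks the symmetry
assert not np.allclose(equivariant_layer(x[swap02], c),
                       equivariant_layer(x, c)[swap02])
```

Since `layer(x, c) = x + c + mean(x)`, permuting `x` by `P` commutes with the layer exactly when `c` is invariant under `P`, which is the sense in which modulating `c` selects the subgroup the network is equivariant to.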

Abstract

The inclusion of symmetries as an inductive bias, known as equivariance, often improves generalization on geometric data (e.g. grids, sets, and graphs). However, equivariant architectures are usually highly constrained, designed for symmetries chosen a priori, and not applicable to datasets with other symmetries. This precludes the development of flexible, multi-modal foundation models capable of processing diverse data equivariantly. In this work, we build a single model -- the Any-Subgroup Equivariant Network (ASEN) -- that can be simultaneously equivariant to several groups, simply by modulating a certain auxiliary input feature. In particular, we start with a fully permutation-equivariant base model, and then obtain subgroup equivariance by using a symmetry-breaking input whose automorphism group is that subgroup. However, finding an input with the desired automorphism group is computationally hard. We overcome this by relaxing from exact to approximate symmetry breaking, leveraging the notion of 2-closure to derive fast algorithms. Theoretically, we show that our subgroup-equivariant networks can simulate equivariant MLPs, and their universality can be guaranteed if the base model is universal. Empirically, we validate our method on symmetry selection for graph and image tasks, as well as multitask and transfer learning for sequence tasks, showing that a single network equivariant to multiple permutation subgroups outperforms both separate equivariant models and a single non-equivariant model.
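The abstract's relaxation hinges on the 2-closure of a permutation group: the largest group with the same orbits on ordered pairs of points. The following brute-force sketch illustrates that definition on tiny groups (it is not the paper's fast algorithm, which is the whole point of their contribution):

```python
from itertools import permutations

def pair_orbits(G, n):
    # Orbit of each ordered pair (i, j) under the group G
    return {(i, j): frozenset((g[i], g[j]) for g in G)
            for i in range(n) for j in range(n)}

def two_closure(G, n):
    # 2-closure of G: every permutation of {0..n-1} that maps each
    # ordered pair into its G-orbit (brute force, tiny n only).
    orbits = pair_orbits(G, n)
    return [p for p in permutations(range(n))
            if all((p[i], p[j]) in orbits[(i, j)]
                   for i in range(n) for j in range(n))]

def sign(p):
    # Parity via inversion count: +1 for even permutations
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p))
                       for j in range(i + 1, len(p)))

# The cyclic group C3 acting regularly on 3 points is already 2-closed:
C3 = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
assert len(two_closure(C3, 3)) == 3

# A4 is 2-transitive on 4 points, so its 2-closure is all of S4:
A4 = [p for p in permutations(range(4)) if sign(p) == 1]
assert len(A4) == 12 and len(two_closure(A4, 4)) == 24
```

The A4 example shows why 2-closure is only an approximation to the original group (it can be strictly larger), which is the trade-off the authors accept in exchange for fast, scalable symmetry-breaking algorithms.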