HyDRA: Hybrid Domain-Aware Robust Architecture for Heterogeneous Collaborative Perception

arXiv cs.CV / 3/26/2026


Key Points

  • The paper presents HyDRA, a unified hybrid domain-aware pipeline for collaborative perception that targets performance drops caused by heterogeneous agents with different architectures or data distributions.
  • HyDRA uses a lightweight domain classifier to detect heterogeneous agents and route them to a late-fusion branch, while in-domain agents continue through intermediate fusion.
  • To counter localization errors typical of late fusion, it introduces anchor-guided pose graph optimization that treats reliable intermediate-fusion detections as fixed spatial anchors.
  • The authors report extensive experimental results showing HyDRA matches state-of-the-art heterogeneity-aware collaborative perception methods without additional training.
  • The method is claimed to scale “at zero cost” as more agents collaborate, maintaining performance without retraining.
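The routing step in the second key point can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Agent` class, the feature statistic, and the threshold-based `domain_score` are all assumptions standing in for whatever lightweight classifier HyDRA actually uses.

```python
# Hypothetical sketch of domain-aware routing: agents whose feature
# statistics deviate from the ego agent's are sent to late fusion.
# All names and the scoring rule are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Agent:
    agent_id: int
    feature_mean: float  # stand-in for a real domain-classifier feature


def domain_score(agent: Agent, ego_mean: float) -> float:
    """Toy domain score: distance between feature statistics."""
    return abs(agent.feature_mean - ego_mean)


def route_agents(
    agents: List[Agent], ego_mean: float, threshold: float = 0.5
) -> Tuple[List[Agent], List[Agent]]:
    """Split agents into intermediate-fusion (in-domain) and
    late-fusion (heterogeneous) branches."""
    intermediate: List[Agent] = []
    late: List[Agent] = []
    for a in agents:
        if domain_score(a, ego_mean) <= threshold:
            intermediate.append(a)
        else:
            late.append(a)
    return intermediate, late
```

Because each incoming agent is scored independently, the routing cost grows linearly with the number of collaborators, which is consistent with the paper's claim of scaling without retraining.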

Abstract

In collaborative perception (CP), an agent's performance can be degraded by heterogeneity arising from differences in model architecture or training data distributions. To address this challenge, we propose HyDRA (Hybrid Domain-Aware Robust Architecture), a unified pipeline that integrates intermediate and late fusion within a domain-aware framework. We introduce a lightweight domain classifier that dynamically identifies heterogeneous agents and assigns them to the late-fusion branch. Furthermore, we propose anchor-guided pose graph optimization to mitigate localization errors inherent in late fusion, leveraging reliable detections from intermediate fusion as fixed spatial anchors. Extensive experiments demonstrate that, despite requiring no additional training, HyDRA achieves performance comparable to state-of-the-art heterogeneity-aware CP methods. Importantly, this performance is maintained as the number of collaborating agents increases, enabling zero-cost scaling without retraining.
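The anchor-guided idea in the abstract can be illustrated with a deliberately simplified sketch. The paper describes pose graph optimization; the snippet below replaces that with the one-step special case where a single rigid translation misaligns a late-fusion agent's detections, so the least-squares correction against fixed anchors reduces to the mean residual. The function names and the pre-matched correspondences are assumptions made for illustration.

```python
# Hedged sketch: align late-fusion detections to fixed spatial anchors
# taken from intermediate fusion. Simplification: detections and anchors
# are already matched one-to-one, and the pose error is pure translation,
# so the least-squares correction is the mean anchor-minus-detection residual.
import numpy as np


def estimate_offset(anchors: np.ndarray, detections: np.ndarray) -> np.ndarray:
    """Least-squares translation mapping detections onto anchors,
    given matched (N, 2) arrays of 2D positions."""
    return (anchors - detections).mean(axis=0)


def correct_detections(detections: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Apply the estimated pose correction to all detections."""
    return detections + offset
```

A full pose graph optimizer would jointly solve rotations and translations over many agents with the anchors held fixed; this closed-form translation case only conveys why trusted intermediate-fusion detections can pin down a late-fusion agent's localization error.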