BEVMAPMATCH: Multimodal BEV Neural Map Matching for Robust Re-Localization of Autonomous Vehicles

arXiv cs.CV / 3/30/2026


Key Points

  • BEVMapMatch is introduced as a framework for robust autonomous-vehicle re-localization in GNSS-denied or GNSS-degraded settings without relying on GNSS priors.
  • The method fuses lidar and camera inputs to produce context-aware multimodal BEV (Bird’s Eye View) segmentations that work in both good and adverse weather conditions.
  • A cross-attention-based search retrieves candidate map patches from a known map, and the best candidate is then refined for finer global alignment using the generated BEV segmentations.
  • Aggregating multiple frames of BEV segmentation further improves accuracy, yielding a reported Recall@1m of 39.8%, nearly double that of the best baseline.
  • The authors state that the code and data will be released via the provided GitHub repository link.
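To make the retrieval-then-refinement steps above concrete, here is a minimal sketch of the two stages: scoring candidate map-patch embeddings against a query BEV embedding with scaled dot-product attention, then refining the top candidate by an exhaustive translation search against the generated BEV segmentation. This is an illustrative reconstruction, not the authors' code; the function names, embedding shapes, and the pixel-agreement refinement score are all assumptions.

```python
import numpy as np

def retrieve_patch(query_emb, patch_embs):
    """Hypothetical cross-attention-style retrieval: score candidate
    map-patch embeddings (N, d) against the query BEV embedding (d,)
    with scaled dot-product attention; return (best_index, weights)."""
    d = query_emb.shape[0]
    logits = patch_embs @ query_emb / np.sqrt(d)   # (N,) attention logits
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                       # softmax attention weights
    return int(np.argmax(weights)), weights

def refine_alignment(bev_seg, map_patch, max_shift=2):
    """Hypothetical fine-alignment step: exhaustively search integer
    (dy, dx) shifts of the retrieved map patch and keep the one that
    maximizes pixel agreement with the generated BEV segmentation."""
    best_shift, best_score = (0, 0), -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(map_patch, dy, axis=0), dx, axis=1)
            score = float((bev_seg == shifted).mean())  # fraction of matching cells
            if score > best_score:
                best_shift, best_score = (dy, dx), score
    return best_shift, best_score
```

In the paper's setting the refinement would operate on semantic BEV grids and could use a learned or correlation-based matcher; the exhaustive search here just illustrates the coarse-to-fine structure of retrieval followed by local alignment.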

Abstract

Localization in GNSS-denied and GNSS-degraded environments is a challenge for the safe, widespread deployment of autonomous vehicles. Such GNSS-challenged environments require alternative methods for robust localization. In this work, we propose BEVMapMatch, a framework for robust vehicle re-localization on a known map without the need for GNSS priors. BEVMapMatch uses a context-aware lidar+camera fusion method to generate multimodal Bird's Eye View (BEV) segmentations around the ego vehicle in both good and adverse weather conditions. Leveraging a search mechanism based on cross-attention, the generated BEV segmentation maps are then used to retrieve candidate map patches for map matching. Finally, BEVMapMatch uses the top retrieved candidate for finer alignment against the generated BEV segmentation, achieving accurate global localization without the need for GNSS. Multiple frames of generated BEV segmentation further improve localization accuracy. Extensive evaluations show that BEVMapMatch outperforms existing methods for re-localization in GNSS-denied and adverse environments, with a Recall@1m of 39.8%, nearly twice that of the best-performing re-localization baseline. Our code and data will be made available at https://github.com/ssuralcmu/BEVMapMatch.git.
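The abstract notes that using multiple frames of generated BEV segmentation improves localization accuracy. The paper does not detail the aggregation mechanism here; one plausible sketch, assuming binary BEV grids and known integer ego-motion shifts between frames, is to warp past frames into the current frame and take a majority vote per cell:

```python
import numpy as np

def aggregate_bev_frames(frames, shifts):
    """Hypothetical multi-frame aggregation: warp each past binary BEV
    frame into the current frame via its integer ego-motion shift
    (dy, dx), then majority-vote per cell. `frames` and `shifts` are
    assumed names; the real method may use learned temporal fusion."""
    acc = np.zeros_like(frames[0], dtype=float)
    for frame, (dy, dx) in zip(frames, shifts):
        acc += np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return (acc / len(frames) >= 0.5).astype(frames[0].dtype)
```

The intuition is that segmentation noise in any single frame is unlikely to persist across warped frames, so voting suppresses it and yields a cleaner map-matching query.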