DRUM: Diffusion-based Raydrop-aware Unpaired Mapping for Sim2Real LiDAR Segmentation

arXiv cs.CV / 3/30/2026


Key Points

  • The paper introduces DRUM, a diffusion-based Sim2Real translation framework aimed at improving LiDAR semantic segmentation when labeled data is plentiful in simulation but scarce in real-world environments.
  • DRUM uses a diffusion model pre-trained on unlabeled real LiDAR data as a generative prior and translates synthetic samples by matching real measurement characteristics such as reflectance intensity and raydrop noise.
  • To enhance the realism of translated samples, the method adds a raydrop-aware masked guidance mechanism that enforces consistency with the synthetic input while still preserving realistic raydrop noise from the diffusion prior.
  • Experiments indicate DRUM yields consistent Sim2Real performance gains across multiple LiDAR data representations, addressing the data-level domain gap between simulated and real sensors.
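The raydrop-aware masked guidance described above can be illustrated with a toy sketch: during diffusion sampling, pixels the prior marks as dropped rays keep the prior's output (so realistic dropout noise survives), while the remaining pixels are pulled toward the synthetic input for consistency. This is a hypothetical NumPy illustration of the general idea, not the paper's implementation; the function and argument names (`masked_guidance_step`, `alpha`) are assumptions.

```python
import numpy as np

def masked_guidance_step(denoised_real, synth_range_img, raydrop_mask, alpha=1.0):
    """Conceptual sketch of raydrop-aware masked guidance (hypothetical, not the paper's code).

    denoised_real:  the diffusion prior's current denoised estimate (real-style range image)
    synth_range_img: the synthetic input range image to stay consistent with
    raydrop_mask:   1 where the prior predicts a dropped ray, 0 elsewhere
    alpha:          guidance strength toward the synthetic input (1.0 = full replacement)
    """
    # In raydrop regions, keep the prior's estimate so realistic dropout noise is preserved;
    # elsewhere, blend toward the synthetic input to enforce content consistency.
    return np.where(
        raydrop_mask.astype(bool),
        denoised_real,
        alpha * synth_range_img + (1.0 - alpha) * denoised_real,
    )
```

In a full sampler this blend would be applied at each reverse-diffusion step, so the prior repeatedly re-noises and re-denoises the guided estimate.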

Abstract

LiDAR-based semantic segmentation is a key component for autonomous mobile robots, yet large-scale annotation of LiDAR point clouds is prohibitively expensive and time-consuming. Although simulators can provide labeled synthetic data, models trained on synthetic data often underperform on real-world data due to a data-level domain gap. To address this issue, we propose DRUM, a novel Sim2Real translation framework. We leverage a diffusion model pre-trained on unlabeled real-world data as a generative prior and translate synthetic data by reproducing two key measurement characteristics: reflectance intensity and raydrop noise. To improve sample fidelity, we introduce a raydrop-aware masked guidance mechanism that selectively enforces consistency with the input synthetic data while preserving realistic raydrop noise induced by the diffusion prior. Experimental results demonstrate that DRUM consistently improves Sim2Real performance across multiple representations of LiDAR data. The project page is available at https://miya-tomoya.github.io/drum.