DIDLM: A SLAM Dataset for Difficult Scenarios Featuring Infrared, Depth Cameras, LIDAR, 4D Radar, and Others under Adverse Weather, Low Light Conditions, and Rough Roads
arXiv cs.RO / 3/26/2026
Key Points
- The paper introduces DIDLM, a multi-sensor SLAM dataset specifically designed for difficult scenarios including adverse weather (snow/rain), low-light/nighttime conditions, and bumpy/rough roads.
- Unlike many existing datasets that focus on a single sensor or a limited combination, DIDLM includes modalities rarely captured under extreme conditions, such as 4D millimeter-wave radar, infrared cameras, and depth cameras, alongside RGB cameras, 3D LiDAR, GPS, and IMU.
- The dataset provides reliable GPS/INS ground truth for both autonomous driving and ground-robot use, covering structured and semi-structured terrains.
- Experiments evaluate multiple SLAM algorithm inputs across modalities (RGB/IR/depth images, LiDAR, and 4D radar) to assess performance under these challenging environments.
- DIDLM totals about 18.5 km of data (69 minutes, ~660 GB) and is publicly released via the project's GitHub repository.
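Since the dataset ships GPS/INS ground-truth trajectories, SLAM outputs on each modality are typically scored with the absolute trajectory error (ATE): rigidly align the estimated trajectory to ground truth, then take the RMSE of point-wise distances. The paper does not specify its exact evaluation code, so the following is a minimal sketch of the standard Umeyama/Kabsch-based ATE computation, with all function and variable names hypothetical:

```python
import numpy as np

def absolute_trajectory_error(est: np.ndarray, gt: np.ndarray) -> float:
    """ATE RMSE between an estimated trajectory and ground truth.

    est, gt: (N, 3) arrays of corresponding positions (meters).
    The estimate is first aligned to ground truth with the optimal
    rigid transform (Kabsch algorithm), since SLAM trajectories are
    only defined up to a global rotation and translation.
    """
    # Center both trajectories.
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g

    # Optimal rotation from the SVD of the cross-covariance matrix.
    H = E.T @ G
    U, _, Vt = np.linalg.svd(H)
    V = Vt.T
    # Reflection guard: force det(R) = +1.
    d = np.sign(np.linalg.det(V @ U.T))
    R = V @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_e

    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

In practice the estimated and ground-truth poses must first be associated by timestamp (e.g. nearest-neighbor matching within a tolerance) before computing correspondences; that step is omitted here for brevity.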