An Open-Source LiDAR and Monocular Off-Road Autonomous Navigation Stack
arXiv cs.RO / 4/6/2026
Key Points
- The paper introduces an open-source off-road autonomous navigation stack that supports both LiDAR-based and monocular 3D perception pipelines for obstacle detection in unstructured terrain.
- For the monocular pipeline, it uses zero-shot depth prediction from Depth Anything V2 and rescales the relative depth to metric units with sparse SLAM measurements from VINS-Mono, avoiding any task-specific training.
- It improves robustness by applying edge-masking to reduce obstacle “hallucinations” from depth estimation and adding temporal smoothing to counter SLAM instability.
- The produced point cloud is converted into a robot-centric 2.5D elevation map used for costmap-based planning.
- Evaluations in Isaac Sim and real-world environments show the monocular setup can match high-resolution LiDAR performance in most scenarios, and the authors open-source the stack and simulation environment for reproducible benchmarking.
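The metric rescaling step in the second bullet can be sketched as follows: sample the relative depth map at the pixels of the sparse SLAM landmarks and take a robust per-frame scale from the ratio of metric to relative depth. This is a minimal illustration under assumed conventions, not the authors' implementation; the function name, the use of a median ratio, and the array layouts are all assumptions.

```python
import numpy as np

def rescale_depth(rel_depth, slam_uv, slam_depths):
    """Rescale a relative (zero-shot) depth map to metric units using
    sparse SLAM landmark depths.

    rel_depth:   (H, W) relative depth from the monocular network
    slam_uv:     (N, 2) integer pixel coords (u, v) of SLAM features
    slam_depths: (N,)   metric depths of those features from SLAM
    """
    # Sample the relative depth at the sparse feature locations.
    rel_at_feats = rel_depth[slam_uv[:, 1], slam_uv[:, 0]]
    # Robust per-frame scale: median ratio of metric to relative depth
    # (a median is one plausible choice to resist outlier landmarks).
    scale = np.median(slam_depths / np.maximum(rel_at_feats, 1e-6))
    return scale * rel_depth, scale
```

In practice such a scale would be recomputed every frame from the current set of tracked VINS-Mono features, which is what makes the temporal smoothing in the third bullet necessary.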
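The two robustness measures in the third bullet can be illustrated in a few lines: discard depth pixels near strong depth discontinuities (where monocular depth tends to hallucinate geometry), and smooth the per-frame scale estimate over time. The gradient threshold, the smoothing form (an exponential moving average), and all names here are hypothetical choices for illustration, not details from the paper.

```python
import numpy as np

def mask_depth_edges(depth, grad_thresh=0.5):
    """Zero out depth pixels near strong depth discontinuities,
    where monocular depth estimates are least trustworthy.
    grad_thresh is an illustrative value, not from the paper."""
    gy, gx = np.gradient(depth)          # per-pixel depth gradients
    grad_mag = np.hypot(gx, gy)
    masked = depth.copy()
    masked[grad_mag > grad_thresh] = 0.0  # 0 = invalid / discarded
    return masked

class ScaleSmoother:
    """Exponential moving average over per-frame scale estimates,
    one plausible form of the temporal smoothing the paper adds
    to damp SLAM instability."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # higher alpha = trust new frames more
        self.value = None
    def update(self, scale):
        if self.value is None:
            self.value = scale
        else:
            self.value = (1 - self.alpha) * self.value + self.alpha * scale
        return self.value
```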
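The point-cloud-to-elevation-map conversion in the fourth bullet reduces to binning points into a robot-centric grid and keeping a height statistic per cell. A minimal sketch, assuming a max-height-per-cell reduction and a square grid (the resolution, extent, and function name are all assumptions):

```python
import numpy as np

def elevation_map(points, res=0.1, half_extent=5.0):
    """Bin a robot-centric point cloud (rows of x, y, z in the robot
    frame) into a 2.5D elevation grid storing the max height per cell.
    Cells never hit by a point stay at -inf (unknown)."""
    n = int(2 * half_extent / res)
    grid = np.full((n, n), -np.inf)
    # Map (x, y) to integer cell indices; drop out-of-range points.
    ij = np.floor((points[:, :2] + half_extent) / res).astype(int)
    ok = (ij >= 0).all(axis=1) & (ij < n).all(axis=1)
    for (i, j), z in zip(ij[ok], points[ok, 2]):
        grid[i, j] = max(grid[i, j], z)
    return grid
```

A costmap-based planner would then threshold or gradient-filter this grid to score traversability, which is consistent with the costmap planning the summary describes.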