HortiMulti: A Multi-Sensor Dataset for Localisation and Mapping in Horticultural Polytunnels

arXiv cs.RO / 3/23/2026


Key Points

  • HortiMulti is a multimodal, cross-season dataset for localisation and mapping in horticultural polytunnels, collected across an entire growing season in commercial strawberry and raspberry environments.
  • The sensor suite includes two 3D LiDARs, four RGB cameras, an IMU, GNSS, and wheel odometry, with ground truth trajectories derived from Total Station surveying, AprilTag fiducials, and LiDAR-inertial odometry to support dense, sparse, and marker-free coverage.
  • The dataset captures substantial appearance variation, dynamic foliage, specular reflections from plastic covers, severe perceptual aliasing, and GNSS-unreliable conditions, all of which directly degrade existing localisation and perception algorithms.
  • Time-synchronised raw measurements, calibration files, reference trajectories, and baseline benchmarks for visual, LiDAR, and multi-sensor SLAM are released; the results show that current state-of-the-art methods remain inadequate for reliable polytunnel deployment.
  • Overall, HortiMulti serves as a one-stop resource for developing and testing robotic perception systems in horticultural environments.
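A practical first step with any multi-sensor dataset like this is associating measurements across streams by timestamp, since cameras, LiDARs, and the IMU record at different rates. The dataset's own tooling is not described here, so the sketch below is a generic, hypothetical helper (the `associate` function and its `max_dt` tolerance are illustrative, not part of HortiMulti):

```python
from bisect import bisect_left

def associate(stamps_a, stamps_b, max_dt=0.02):
    """Greedily pair two sorted lists of timestamps (seconds).

    For each stamp in stamps_a, find the nearest unused stamp in
    stamps_b; pairs farther apart than max_dt are discarded.
    Returns a list of (index_a, index_b) tuples.
    """
    pairs = []
    used = set()
    for i, ta in enumerate(stamps_a):
        # bisect gives the insertion point; the nearest neighbour is
        # either the element just before it or the one at it.
        j = bisect_left(stamps_b, ta)
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(stamps_b) and k not in used:
                dt = abs(stamps_b[k] - ta)
                if dt <= max_dt and (best is None or dt < best[1]):
                    best = (k, dt)
        if best is not None:
            pairs.append((i, best[0]))
            used.add(best[0])
    return pairs

# Example: 10 Hz camera stamps vs. slightly offset LiDAR stamps.
cam = [0.0, 0.1, 0.2]
lidar = [0.005, 0.11, 0.35]
print(associate(cam, lidar))  # → [(0, 0), (1, 1)]; 0.2 has no match within 20 ms
```

A 20 ms tolerance is roughly half a 10 Hz frame period; for hardware-triggered streams a much tighter tolerance would be appropriate.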

Abstract

Agricultural robotics is gaining increasing relevance in both research and real-world deployment. As these systems are expected to operate autonomously in more complex tasks, the availability of representative real-world datasets becomes essential. While domains such as urban and forestry robotics benefit from large and established benchmarks, horticultural environments remain comparatively under-explored despite the economic significance of this sector. To address this gap, we present HortiMulti, a multimodal, cross-season dataset collected in commercial strawberry and raspberry polytunnels across an entire growing season, capturing substantial appearance variation, dynamic foliage, specular reflections from plastic covers, severe perceptual aliasing, and GNSS-unreliable conditions, all of which directly degrade existing localisation and perception algorithms. The sensor suite includes two 3D LiDARs, four RGB cameras, an IMU, GNSS, and wheel odometry. Ground truth trajectories are derived from a combination of Total Station surveying, AprilTag fiducial markers, and LiDAR-inertial odometry, spanning dense, sparse, and marker-free coverage to support evaluation under both controlled and realistic conditions. We release time-synchronised raw measurements, calibration files, reference trajectories, and baseline benchmarks for visual, LiDAR, and multi-sensor SLAM. The results confirm that current state-of-the-art methods remain inadequate for reliable polytunnel deployment, establishing HortiMulti as a one-stop resource for developing and testing robotic perception systems in horticultural environments.
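The baseline benchmarks compare estimated trajectories against the reference trajectories. A standard metric for this is absolute trajectory error (ATE): rigidly align the estimate to the ground truth, then take the RMSE of the position residuals. The paper's exact evaluation protocol is not reproduced here; the following is a minimal sketch of that metric using the Umeyama alignment (rotation and translation only, no scale), with NumPy as an assumed dependency:

```python
import numpy as np

def umeyama_align(est, ref):
    """Least-squares rigid alignment (rotation R, translation t) that
    maps estimated positions onto reference positions (Umeyama, no scale).

    est, ref: (N, 3) arrays of corresponding positions.
    """
    mu_e, mu_r = est.mean(axis=0), ref.mean(axis=0)
    cov = (ref - mu_r).T @ (est - mu_e) / len(est)
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:  # guard against reflections
        S[2, 2] = -1.0
    R = U @ S @ Vt
    t = mu_r - R @ mu_e
    return R, t

def ate_rmse(est, ref):
    """Absolute trajectory error: position RMSE after rigid alignment."""
    R, t = umeyama_align(est, ref)
    residuals = (est @ R.T + t) - ref
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# Sanity check: a rotated + translated copy of the reference should
# align back perfectly, giving (numerically) zero ATE.
rng = np.random.default_rng(0)
ref = rng.random((50, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
est = ref @ R_true.T + np.array([1.0, 2.0, 3.0])
print(ate_rmse(est, ref))  # ≈ 0
```

This requires estimated and reference poses to already be associated (e.g. by timestamp); evaluation suites such as evo implement the same idea with additional handling for scale and relative-pose metrics.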