MoonAnything: A Vision Benchmark with Large-Scale Lunar Supervised Data

arXiv cs.CV / 4/2/2026


Key Points

  • The paper introduces MoonAnything, a large-scale lunar perception benchmark designed to overcome prior dataset gaps in geometric ground truth and photometric realism under diverse illumination.
  • MoonAnything provides two complementary sub-datasets: LunarGeo for stereo imagery with dense depth and camera calibration, and LunarPhoto for photorealistic rendering using a spatially-varying BRDF model plus multi-illumination data for reflectance and illumination-robust perception.
  • The benchmark is built from real lunar topography with physically-based rendering and delivers over 130K samples with comprehensive supervision covering 3D reconstruction, pose estimation, and reflectance estimation.
  • The authors report baseline results with state-of-the-art methods and emphasize that the dataset is a challenging testbed for low-texture, high-contrast environments, with potential generalization to other airless celestial bodies.
  • The full dataset and generation tools are released publicly to support community extension and algorithm development for lunar and related domains.
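The pairing of dense depth maps with camera calibration (the LunarGeo sub-dataset) is what enables the 3D-reconstruction tasks mentioned above. As a minimal sketch of how such supervision is typically consumed, the snippet below back-projects a dense depth map into a camera-frame point cloud under a standard pinhole model. The intrinsics (`fx`, `fy`, `cx`, `cy`) and the depth map are synthetic placeholders, not MoonAnything's actual calibration format.

```python
import numpy as np

# Hypothetical pinhole intrinsics; MoonAnything's actual calibration
# layout is not specified in this summary, so these are placeholders.
fx, fy, cx, cy = 500.0, 500.0, 64.0, 48.0

def depth_to_points(depth):
    """Back-project a dense depth map (H, W) into an (H*W, 3) point
    cloud in the camera frame, assuming a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Tiny synthetic example: a flat surface 10 m in front of the camera.
depth = np.full((96, 128), 10.0)
pts = depth_to_points(depth)
print(pts.shape)        # (12288, 3)
print(pts[:, 2].min())  # 10.0
```

With ground-truth depth of this kind, a reconstruction method's output can be scored directly against the back-projected points, which is the usual role of such supervision in a benchmark.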

Abstract

Accurate perception of lunar surfaces is critical for modern lunar exploration missions. However, developing robust learning-based perception systems is hindered by the lack of datasets that provide both geometric and photometric supervision. Existing lunar datasets typically lack geometric ground truth, photometric realism, illumination diversity, or large-scale coverage. In this paper, we introduce MoonAnything, a unified benchmark built on real lunar topography with physically-based rendering, providing the first comprehensive geometric and photometric supervision under diverse illumination at large scale. The benchmark comprises two complementary sub-datasets: i) LunarGeo provides stereo images with corresponding dense depth maps and camera calibration, enabling 3D reconstruction and pose estimation; ii) LunarPhoto provides photorealistic images rendered with a spatially-varying BRDF model, along with multi-illumination renderings under real solar configurations, enabling reflectance estimation and illumination-robust perception. Together, these datasets offer over 130K samples with comprehensive supervision. Beyond lunar applications, MoonAnything offers a unique and challenging testbed for algorithms under low-texture, high-contrast conditions; its setting applies to other airless celestial bodies and may generalize beyond them. We establish baselines using state-of-the-art methods and release the complete dataset along with generation tools to support community extension: https://github.com/clementinegrethen/MoonAnything.
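The multi-illumination renderings in LunarPhoto exist because appearance on an airless body changes drastically with sun direction. The sketch below illustrates that dependence with a plain Lambertian model: the same synthetic surface is shaded under an overhead and a grazing sun. This is a deliberately simplified stand-in for the paper's spatially-varying BRDF; the albedo map, normals, and sun directions are all made up for illustration.

```python
import numpy as np

# Synthetic per-pixel albedo (lunar regolith is dark) and flat normals;
# both are placeholders, not data from the MoonAnything benchmark.
rng = np.random.default_rng(0)
h, w = 32, 32
albedo = rng.uniform(0.05, 0.15, size=(h, w))
normals = np.zeros((h, w, 3))
normals[..., 2] = 1.0  # flat terrain facing the camera

def shade(sun_dir):
    """Lambertian intensity: albedo * max(0, n . l)."""
    l = np.asarray(sun_dir, dtype=float)
    l = l / np.linalg.norm(l)
    ndotl = np.clip(normals @ l, 0.0, None)
    return albedo * ndotl

noon = shade([0.0, 0.0, 1.0])     # sun directly overhead
grazing = shade([1.0, 0.0, 0.1])  # low solar elevation, high contrast

print(noon.mean() > grazing.mean())  # True: grazing light darkens the scene
```

Rendering the same terrain under several real solar configurations, as the dataset does, gives a model the paired observations needed to disentangle reflectance from illumination.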