FullCircle: Effortless 3D Reconstruction from Casual 360° Captures

arXiv cs.CV / 3/25/2026


Key Points

  • The paper presents “FullCircle,” a practical radiance-field-based pipeline for 3D scene reconstruction using raw 360° camera footage without special capture protocols or pre-processing.
  • It addresses a key limitation of existing approaches by exploiting the much wider field of view of 360° cameras, which improves viewpoint coverage and yields denser feature correspondences for reliable reconstruction.
  • The method is designed to be robust to a common real-world failure source: the human operator visible in 360° imagery.
  • To support fair comparison and evaluation, the authors release a multi-tier dataset of scenes captured as raw dual-fisheye images as a benchmark for casual 360° reconstruction.
  • Results indicate the approach substantially outperforms both vanilla 3D Gaussian Splatting adapted for 360° cameras and simulated perspective-camera baselines under matched capture conditions.
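One common way to achieve the robustness to the visible operator mentioned above is to exclude operator pixels from the reconstruction loss. The sketch below is purely illustrative and not taken from the paper (whose exact mechanism is not described here); it shows a mask-weighted L1 photometric loss where `operator_mask` is a hypothetical per-pixel mask marking the operator.

```python
import numpy as np

def masked_l1_loss(rendered, target, operator_mask):
    """Mask-weighted L1 photometric loss (illustrative sketch).

    rendered, target : (H, W, 3) float arrays of rendered / captured colors
    operator_mask    : (H, W) array, 1 where the human operator is visible

    Pixels covered by the operator contribute nothing to the loss, so the
    radiance field is not forced to explain a moving person. This is one
    plausible mechanism; the paper's actual approach may differ.
    """
    valid = 1.0 - operator_mask.astype(np.float64)
    diff = np.abs(rendered - target) * valid[..., None]
    # Normalize by the number of valid color channels (guard against
    # the degenerate fully-masked case).
    denom = max(valid.sum() * rendered.shape[-1], 1.0)
    return diff.sum() / denom
```

With a perfect mask, the loss over unmasked pixels is identical to a plain per-pixel L1 mean over those pixels.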

Abstract

Radiance fields have emerged as powerful tools for 3D scene reconstruction. However, casual capture remains challenging due to the narrow field of view of perspective cameras, which limits viewpoint coverage and feature correspondences necessary for reliable camera calibration and reconstruction. While commercially available 360° cameras offer significantly broader coverage than perspective cameras for the same capture effort, existing 360° reconstruction methods require special capture protocols and pre-processing steps that undermine the promise of radiance fields: effortless workflows to capture and reconstruct 3D scenes. We propose a practical pipeline for reconstructing 3D scenes directly from raw 360° camera captures. We require no special capture protocols or pre-processing, and exhibit robustness to a prevalent source of reconstruction errors: the human operator that is visible in all 360° imagery. To facilitate evaluation, we introduce a multi-tiered dataset of scenes captured as raw dual-fisheye images, establishing a benchmark for robust casual 360° reconstruction. Our method significantly outperforms not only vanilla 3DGS for 360° cameras but also robust perspective baselines when perspective cameras are simulated from the same capture, demonstrating the advantages of 360° capture for casual reconstruction. Additional results are available at: https://theialab.github.io/fullcircle
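To make the "raw dual-fisheye" capture format concrete: each frame from a consumer 360° camera is two circular fisheye images, one per lens, and reconstruction pipelines must map their pixels to viewing rays. The sketch below assumes an ideal equidistant fisheye projection (r = f·θ), which is a common idealization and not necessarily the camera model the paper uses; the center `(cx, cy)` and focal length `f` are hypothetical calibration parameters.

```python
import math

def fisheye_pixel_to_ray(u, v, cx, cy, f, back_lens=False):
    """Map a pixel in one fisheye of a dual-fisheye capture to a unit
    viewing ray, under an ideal equidistant projection (r = f * theta).

    (cx, cy) : fisheye circle center in pixels (assumed calibration)
    f        : focal length in pixels (assumed calibration)
    back_lens: if True, the lens faces the opposite direction
    """
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)
    theta = r / f                 # angle from the optical axis
    phi = math.atan2(dy, dx)      # azimuth around the optical axis
    ray = (math.sin(theta) * math.cos(phi),
           math.sin(theta) * math.sin(phi),
           math.cos(theta))
    if back_lens:
        # The second lens looks the opposite way: rotate 180° about the
        # vertical axis so both lenses share one camera frame.
        ray = (-ray[0], ray[1], -ray[2])
    return ray
```

A pixel at the fisheye center maps to the optical axis `(0, 0, 1)`; pixels at radius `f·π/2` map to rays 90° off-axis, which is where the two lenses' fields of view begin to overlap.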