NimbusGS: Unified 3D Scene Reconstruction under Hybrid Weather

arXiv cs.CV / 3/31/2026


Key Points

  • NimbusGS is a unified 3D scene reconstruction framework designed to handle degraded multi-view inputs captured under mixed adverse weather, aiming for better cross-weather generalization than methods tuned to a single condition.
  • The approach models weather as two components: a continuous, view-consistent transmission effect (light attenuation) and dynamic, view-dependent particulate residuals that drive scattering and occlusion.
  • NimbusGS decomposes degradations into a global transmission field shared across views and per-view particulate residuals to disentangle static atmospheric effects from transient input-specific disturbances.
  • To improve stability when visibility is severely degraded, it introduces a geometry-guided gradient scaling mechanism that reduces gradient imbalance during self-supervised optimization of 3D Gaussian representations.
  • The paper reports that this physically grounded formulation preserves scene structure and produces superior geometry reconstruction, outperforming task-specific baselines across diverse weather scenarios, with code released on GitHub.
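The decomposition in the points above echoes the classical atmospheric scattering model: attenuated scene radiance plus scattered airlight, with an extra per-view term for transient particles. A minimal sketch of that compositing is below; the function name, the exact additive form of the residual, and the toy values are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def composite_degraded_view(clean_rgb, transmission, airlight, residual):
    """Composite a degraded observation from a clean rendering.

    clean_rgb:    (H, W, 3) scene radiance rendered from the 3D Gaussians
    transmission: (H, W, 1) global transmission field shared across views
                  (static atmospheric attenuation, values in [0, 1])
    airlight:     (3,) ambient atmospheric light color
    residual:     (H, W, 3) per-view particulate residual (transient,
                  view-dependent scattering/occlusion)
    """
    # Classical scattering model: attenuate the scene by transmission,
    # add scattered airlight, then the per-view residual on top.
    attenuated = clean_rgb * transmission
    scattered = airlight * (1.0 - transmission)
    return np.clip(attenuated + scattered + residual, 0.0, 1.0)

# Toy example: heavy fog (low transmission) washes the image toward airlight.
clean = np.full((4, 4, 3), 0.2)
t = np.full((4, 4, 1), 0.3)
A = np.array([0.8, 0.8, 0.8])
res = np.zeros((4, 4, 3))
degraded = composite_degraded_view(clean, t, A, res)  # 0.2*0.3 + 0.8*0.7 = 0.62
```

Because the transmission field is shared across views while the residual is per-view, optimizing this forward model lets static haze and transient particles be explained by different terms.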

Abstract

We present NimbusGS, a unified framework for reconstructing high-quality 3D scenes from degraded multi-view inputs captured under diverse and mixed adverse weather conditions. Unlike existing methods that target specific weather types, NimbusGS addresses the broader challenge of generalization by modeling the dual nature of weather: a continuous, view-consistent medium that attenuates light, and dynamic, view-dependent particles that cause scattering and occlusion. To capture this structure, we decompose degradations into a global transmission field and per-view particulate residuals. The transmission field represents static atmospheric effects shared across views, while the residuals model transient disturbances unique to each input. To enable stable geometry learning under severe visibility degradation, we introduce a geometry-guided gradient scaling mechanism that mitigates gradient imbalance during the self-supervised optimization of 3D Gaussian representations. This physically grounded formulation allows NimbusGS to disentangle complex degradations while preserving scene structure, yielding superior geometry reconstruction and outperforming task-specific methods across diverse and challenging weather conditions. Code is available at https://github.com/lyy-ovo/NimbusGS.
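The geometry-guided gradient scaling mentioned in the abstract can be illustrated with a hedged sketch: per-Gaussian gradients are rescaled by a geometry-derived confidence so that poorly observed regions (e.g. behind dense fog) do not destabilize optimization. The function name and the confidence heuristic below are assumptions, not the paper's actual mechanism.

```python
import numpy as np

def scale_gradients_by_geometry(grads, geometry_confidence, eps=1e-6):
    """Rescale per-Gaussian gradients to mitigate gradient imbalance.

    grads:               (N, D) raw gradients for N Gaussians
    geometry_confidence: (N,) confidence in each Gaussian's geometry, e.g.
                         from depth consistency across views (values in
                         [0, 1]; an illustrative heuristic, not the paper's)
    """
    # Normalize by the mean confidence: well-constrained Gaussians keep
    # (or gain) gradient magnitude, poorly observed ones are damped.
    scale = geometry_confidence / (geometry_confidence.mean() + eps)
    return grads * scale[:, None]

# Gaussians seen through dense fog (confidence 0.1) get damped gradients
# relative to well-observed ones (confidence 0.9).
grads = np.ones((2, 3))
conf = np.array([0.1, 0.9])
scaled = scale_gradients_by_geometry(grads, conf)
```

With a mean confidence of 0.5, the low-confidence Gaussian's gradient is scaled to roughly 0.2 and the high-confidence one to roughly 1.8, rebalancing the update without changing its direction.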