NimbusGS: Unified 3D Scene Reconstruction under Hybrid Weather
arXiv cs.CV / March 31, 2026
Key Points
- NimbusGS is a unified 3D scene reconstruction framework designed to handle degraded multi-view inputs captured under mixed adverse weather, aiming for better cross-weather generalization than methods tuned to a single condition.
- The approach decomposes weather degradation into two components: a global, view-consistent transmission field (light attenuation) shared across views, and dynamic, per-view particulate residuals that capture scattering and occlusion. This disentangles static atmospheric effects from transient, input-specific disturbances.
- To improve stability when visibility is severely degraded, it introduces a geometry-guided gradient scaling mechanism that reduces gradient imbalance during self-supervised optimization of 3D Gaussian representations.
- The paper reports that this physically grounded formulation preserves scene structure, yields more accurate geometry, and outperforms task-specific baselines across diverse weather scenarios; code is released on GitHub.
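The two-component decomposition can be sketched with a standard atmospheric scattering model: a depth-dependent transmission term attenuates the clean radiance and blends in airlight, and a per-view residual adds transient particulate effects. All function and variable names here are illustrative assumptions; the paper's actual parameterization of the transmission field and residuals may differ.

```python
import numpy as np

def render_degraded(clean_rgb, depth, beta, airlight, residual):
    """Compose a degraded view from a clean rendering (illustrative sketch).

    clean_rgb : (H, W, 3) scene radiance without weather effects
    depth     : (H, W)    per-pixel depth used to compute transmission
    beta      : float     global attenuation coefficient, shared across views
    airlight  : (3,)      ambient atmospheric color
    residual  : (H, W, 3) per-view particulate residual (view-dependent)
    """
    # Global, view-consistent transmission: light falls off exponentially
    # with depth (common scattering-model assumption, not the paper's exact form).
    transmission = np.exp(-beta * depth)[..., None]          # (H, W, 1)
    # Attenuated scene plus airlight, then the transient per-view residual.
    degraded = transmission * clean_rgb + (1.0 - transmission) * airlight
    return np.clip(degraded + residual, 0.0, 1.0)
```

With `beta = 0` and a zero residual the degraded view reduces to the clean rendering, which is why the transmission term alone cannot explain view-dependent artifacts; those are pushed into the residual.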