NEC-Diff: Noise-Robust Event-RAW Complementary Diffusion for Seeing Motion in Extreme Darkness

arXiv cs.CV / 3/23/2026


Key Points

  • NEC-Diff introduces a diffusion-based framework that fuses raw (RAW) imagery with event camera data to reconstruct high-fidelity scenes in extreme darkness.
  • It leverages the linear light response of RAW images and the brightness-change cues from events to enforce a physics-driven, dual-modal denoising constraint.
  • The method dynamically estimates the SNR for both modalities and uses this to guide adaptive feature fusion within the diffusion process.
  • A new RAW-and-Event dataset, REAL, provides 47,800 pixel-aligned low-light RAW images, events, and high-quality references under lux levels 0.001–0.8.
  • Experiments show NEC-Diff achieves superior reconstruction in extreme darkness; the authors release the code and dataset on the project GitHub.
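The physics-driven constraint in the first key point follows from two facts: RAW pixel values respond linearly to scene radiance, and an event camera fires an event whenever the log intensity at a pixel changes by a fixed contrast threshold C. So the log-ratio of two RAW frames should agree with the accumulated event polarities scaled by C. The sketch below illustrates this consistency check; the function name, the absolute-residual form, and the threshold value are illustrative assumptions, not the paper's actual loss.

```python
import numpy as np

def event_raw_consistency(raw_t0, raw_t1, event_sum,
                          contrast_threshold=0.2, eps=1e-6):
    """Per-pixel consistency residual between a RAW frame pair and events.

    RAW linearity means log(raw_t1) - log(raw_t0) approximates the true
    log-radiance change; the event generation model says that change is
    about contrast_threshold * (accumulated signed event count).
    A small residual indicates both modalities agree (likely signal);
    a large residual flags noise in one of them.
    """
    log_change = np.log(raw_t1 + eps) - np.log(raw_t0 + eps)
    predicted = contrast_threshold * event_sum
    return np.abs(log_change - predicted)
```

In a denoising loop, this residual could serve as a cross-modal reliability map: pixels where the two modalities disagree are down-weighted before fusion.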

Abstract

High-quality imaging of dynamic scenes in extremely low-light conditions is highly challenging. Photon scarcity induces severe noise and texture loss, causing significant image degradation. Event cameras, featuring a high dynamic range (120 dB) and high sensitivity to motion, serve as powerful complements to conventional cameras by offering crucial cues for preserving subtle textures. However, most existing approaches emphasize texture recovery from events, while paying little attention to image noise or the intrinsic noise of events themselves, which ultimately hinders accurate pixel reconstruction under photon-starved conditions. In this work, we propose NEC-Diff, a novel diffusion-based event-RAW hybrid imaging framework that extracts reliable information from heavily noisy signals to reconstruct fine scene structures. The framework is driven by two key insights: (1) combining the linear light-response property of RAW images with the brightness-change nature of events to establish a physics-driven constraint for robust dual-modal denoising; and (2) dynamically estimating the SNR of both modalities based on denoising results to guide adaptive feature fusion, thereby injecting reliable cues into the diffusion process for high-fidelity visual reconstruction. Furthermore, we construct the REAL (Raw and Event Acquired in Low-light) dataset, which provides 47,800 pixel-aligned low-light RAW images, events, and high-quality references under 0.001-0.8 lux illumination. Extensive experiments demonstrate the superiority of NEC-Diff under extreme darkness. The project is available at: https://github.com/jinghan-xu/NEC-Diff.
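The second insight, SNR-guided adaptive fusion, can be sketched as a per-pixel weighting of the two feature streams by their estimated reliability. The paper's actual gating is learned inside the diffusion network; the softmax-over-modalities weighting below is a minimal illustrative stand-in, and all names and shapes are assumptions.

```python
import numpy as np

def snr_weighted_fusion(feat_raw, feat_event, snr_raw, snr_event):
    """Fuse RAW and event feature maps using per-pixel SNR estimates.

    feat_raw, feat_event: (H, W, C) feature maps from each modality.
    snr_raw, snr_event:   (H, W) estimated signal-to-noise ratios.
    Returns an (H, W, C) map where the higher-SNR modality dominates.
    """
    snr = np.stack([snr_raw, snr_event])          # (2, H, W)
    w = np.exp(snr) / np.exp(snr).sum(axis=0)     # softmax over the 2 modalities
    return w[0][..., None] * feat_raw + w[1][..., None] * feat_event
```

The design point is that the weights vary per pixel: in a region where photon noise swamps the RAW signal but motion produced clean events, the event features dominate, and vice versa in static well-exposed regions.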
