OmniLight: One Model to Rule All Lighting Conditions

arXiv cs.CV / 4/17/2026


Key Points

  • The paper presents a study of lighting-related image restoration focused on improving performance under adverse conditions such as cast shadows and irregular illumination.
  • It compares DINOLight, a specialized framework for Ambient Lighting Normalization (ALN) tuned to each individual dataset, against OmniLight, a unified model trained across multiple datasets.
  • OmniLight uses a newly proposed Wavelet Domain Mixture-of-Experts (WD-MoE) design to better handle diverse lighting domains.
  • The authors analyze how data distribution affects specialized versus unified architectures in lighting restoration.
  • Both DINOLight-based and OmniLight methods achieved top-tier results across all three lighting tracks in the NTIRE 2026 Challenge, and the code is released on GitHub.
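The WD-MoE idea described above can be illustrated with a minimal sketch: decompose the input into wavelet subbands, softly route each subband through a small set of experts via a learned gate, and reconstruct. Everything below is an illustrative assumption, not the paper's implementation — the Haar wavelet choice, the energy-statistics gate, the expert forms, and all function names (`haar_dwt2`, `wd_moe_restore`) are hypothetical.

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2D Haar DWT: split an (H, W) image into four subbands.
    Returns (LL, LH, HL, HH), each of shape (H//2, W//2)."""
    a = x[0::2, 0::2]; b = x[0::2, 1::2]
    c = x[1::2, 0::2]; d = x[1::2, 1::2]
    LL = (a + b + c + d) / 2.0   # low-frequency approximation
    LH = (a - b + c - d) / 2.0   # horizontal detail
    HL = (a + b - c - d) / 2.0   # vertical detail
    HH = (a - b - c + d) / 2.0   # diagonal detail
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    H2, W2 = LL.shape
    x = np.empty((H2 * 2, W2 * 2))
    x[0::2, 0::2] = (LL + LH + HL + HH) / 2.0
    x[0::2, 1::2] = (LL - LH + HL - HH) / 2.0
    x[1::2, 0::2] = (LL + LH - HL - HH) / 2.0
    x[1::2, 1::2] = (LL - LH - HL + HH) / 2.0
    return x

def wd_moe_restore(x, experts, gate_w):
    """Toy wavelet-domain MoE forward pass: per-subband soft routing.

    experts: list of callables mapping a subband to a subband.
    gate_w:  (num_experts, 2) gating weights over [mean, std] of each subband
             (a stand-in for a learned gating network)."""
    out_bands = []
    for band in haar_dwt2(x):
        feat = np.array([band.mean(), band.std()])   # cheap gating features
        logits = gate_w @ feat                        # (num_experts,)
        w = np.exp(logits - logits.max())
        w /= w.sum()                                  # softmax expert weights
        y = sum(wi * e(band) for wi, e in zip(w, experts))
        out_bands.append(y)
    return haar_idwt2(*out_bands)

# Usage: two toy experts (identity and attenuation) on a random image.
rng = np.random.default_rng(0)
img = rng.random((8, 8))
experts = [lambda b: b, lambda b: 0.5 * b]
gate_w = rng.standard_normal((2, 2))
restored = wd_moe_restore(img, experts, gate_w)
```

In a real model the experts would be convolutional sub-networks and the gate a learned module, but the sketch shows the routing structure: each frequency band can be handled by whichever expert the gate prefers for that lighting domain.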

Abstract

Adverse lighting conditions, such as cast shadows and irregular illumination, pose significant challenges to computer vision systems by degrading visibility and color fidelity. Consequently, effective shadow removal and ambient lighting normalization (ALN) are critical for restoring underlying image content, improving perceptual quality, and facilitating robust performance in downstream tasks. However, while achieving state-of-the-art results on specific benchmarks is a primary goal in image restoration challenges, real-world applications often demand robust models capable of handling diverse domains. To address this, we present a comprehensive study of lighting-related image restoration by exploring two contrasting strategies. We leverage DINOLight, a robust framework for ALN, as a specialized baseline that exploits the characteristics of each individual dataset, and extend it to OmniLight, a generalized alternative incorporating our proposed Wavelet Domain Mixture-of-Experts (WD-MoE) and trained across all provided datasets. Through a comparative analysis of these two methods, we discuss the impact of data distribution on the performance of specialized and unified architectures in lighting-related image restoration. Notably, both approaches secured top-tier rankings across all three lighting-related tracks in the NTIRE 2026 Challenge, demonstrating their outstanding perceptual quality and generalization capabilities. Our code is available at https://github.com/OBAKSA/Lighting-Restoration.