Paper ID: 2404.17503

Inhomogeneous illumination image enhancement under extremely low visibility condition

Libang Chen, Jinyan Lin, Qihang Bian, Yikun Liu, Jianying Zhou

Imaging through dense fog presents unique challenges: essential visual information required for applications such as object detection and recognition is obscured, hindering conventional image processing methods. Although neural network-based approaches have brought improvements, they falter under extremely low visibility conditions exacerbated by inhomogeneous illumination, where inconsistent signal intensities degrade deep learning performance. In this paper, we introduce a novel method that adaptively filters background illumination based on Structural Differential and Integral Filtering (SDIF) so that only the vital signal information is enhanced. Grayscale banding is eliminated by incorporating a visual optimization strategy based on image gradients, and Maximum Histogram Equalization (MHE) is used to achieve high contrast while maintaining fidelity to the original content. We evaluated the algorithm on data collected from both a fog chamber and outdoor environments and performed comparative analyses with existing methods. Our findings show that the proposed method significantly enhances signal clarity under extremely low visibility conditions and outperforms existing techniques, offering substantial improvements for deep fog imaging applications.
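
The abstract only names the pipeline stages (SDIF-based background illumination filtering, gradient-based banding removal, and MHE for contrast). As a rough illustration of that general idea, the sketch below estimates and removes a slowly varying background illumination and then boosts local contrast; it deliberately uses stand-in operators (a Gaussian low-pass background estimate and OpenCV CLAHE), not the paper's SDIF or MHE implementations.

```python
import cv2
import numpy as np

def enhance_low_visibility(img_gray, sigma=51, clip_limit=3.0, tile=(8, 8)):
    """Illustrative pipeline only: suppress inhomogeneous background
    illumination, then enhance local contrast. The operators here are
    generic stand-ins, not the paper's SDIF / MHE."""
    img = img_gray.astype(np.float32) + 1.0

    # Estimate the slowly varying (inhomogeneous) background illumination
    # with a large Gaussian blur (assumption: SDIF plays a comparable role).
    background = cv2.GaussianBlur(img, (0, 0), sigma)

    # Divide out the illumination so only the signal variations remain.
    detail = img / background

    # Rescale to 8-bit and apply CLAHE as a generic high-contrast step
    # (the paper uses Maximum Histogram Equalization instead).
    detail = cv2.normalize(detail, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile)
    return clahe.apply(detail)

# Example usage (hypothetical file name):
# enhanced = enhance_low_visibility(cv2.imread("foggy.png", cv2.IMREAD_GRAYSCALE))
```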

Submitted: Apr 26, 2024