Paper ID: 2305.09996

Restoring Images Captured in Arbitrary Hybrid Adverse Weather Conditions in One Go

Ye-Cong Wan, Ming-Wen Shao, Yuan-Shuo Cheng, Yue-Xian Liu, Zhi-Yuan Bao

Images captured in adverse conditions typically suffer from stochastic hybrid weather degradations (e.g., a rainy and hazy night), while existing image restoration algorithms assume that weather degradations occur independently and may therefore fail to handle complicated real-world scenarios. Besides, supervised training is not feasible due to the lack of a comprehensive paired dataset characterizing hybrid conditions. To this end, we address the aforementioned limitations with two tactics: framework and data. First, we present a novel unified framework, dubbed RAHC, to Restore Arbitrary Hybrid adverse weather Conditions in one go. Specifically, RAHC leverages a multi-head aggregation architecture to learn multiple degradation representation subspaces and then constrains the network to flexibly handle multiple hybrid adverse weather conditions in a unified paradigm through a discrimination mechanism in the output space. Furthermore, we devise a reconstruction-vectors-aided scheme that provides auxiliary visual content cues for reconstruction, and can thus comfortably cope with hybrid scenarios in which few uncorrupted image constituents remain. Second, we construct a new dataset, termed HAC, for learning and benchmarking arbitrary Hybrid Adverse Conditions restoration. HAC contains 31 scenarios, one for each non-empty combination of five common weather types (2^5 - 1 = 31), with a total of ~316K adverse-weather/clean pairs. Extensive experiments yield superior results and establish new state-of-the-art performance on both HAC and conventional datasets.
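As a concrete illustration of the multi-head aggregation idea described above, the following is a minimal PyTorch-style sketch: several parallel heads each learn a degradation-specific representation subspace, and their outputs are fused into a single feature map before decoding. All module names, layer choices, and sizes here are illustrative assumptions, not the authors' actual RAHC implementation.

```python
# Hypothetical sketch of multi-head degradation-subspace aggregation.
# One head per base weather type (rain, haze, snow, ...); the fused
# feature is what a downstream decoder would restore from.
import torch
import torch.nn as nn

class MultiHeadAggregation(nn.Module):
    def __init__(self, channels: int = 64, num_heads: int = 5):
        super().__init__()
        # One lightweight head per base degradation type (sizes are assumptions).
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
            )
            for _ in range(num_heads)
        ])
        # 1x1 convolution fuses the per-degradation subspaces back together.
        self.fuse = nn.Conv2d(channels * num_heads, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        subspaces = [head(x) for head in self.heads]   # degradation-specific features
        return self.fuse(torch.cat(subspaces, dim=1))  # aggregated representation

# Usage: encoder features of a degraded image in, fused features out.
features = torch.randn(1, 64, 128, 128)
fused = MultiHeadAggregation()(features)
print(fused.shape)  # torch.Size([1, 64, 128, 128])
```

Sharing one aggregation module across all 31 hybrid scenarios, rather than training one model per combination, is what would make such a design "one go"; how RAHC actually constrains the heads via its output-space discrimination mechanism is detailed in the paper itself.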

Submitted: May 17, 2023