Prior-based Domain Adaptive Object Detection for Adverse Weather Conditions
Adverse weather conditions such as rain and haze degrade the quality of captured images, causing detection networks trained on clean images to perform poorly on them. To address this issue, we propose an unsupervised prior-based domain adversarial object detection framework for adapting detectors to different weather conditions. We make two observations: corruptions due to different weather conditions (i) follow the principles of physics and can therefore be mathematically modeled, and (ii) often degrade the feature space, leading to deterioration in detection performance. Motivated by these observations, we use weather-specific prior knowledge, obtained from the principles of image formation, to define a novel prior-adversarial loss. The prior-adversarial loss guides the adaptation process to produce weather-invariant features by reducing weather-specific information in the features, thereby mitigating the effect of weather on detection performance. Additionally, we introduce a set of residual feature recovery blocks in the object detection pipeline to de-distort the feature space, resulting in further improvements. The proposed framework outperforms existing methods by a large margin on several datasets, including Foggy-Cityscapes, Rainy-Cityscapes, RTTS and UFDD.
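To make the prior-adversarial idea concrete, the sketch below shows one plausible way such a loss could be wired up in PyTorch: a small prior-estimation head tries to regress a physics-based weather prior (e.g., a haze transmission map) from backbone features, while a gradient-reversal layer pushes the backbone to remove that weather-specific information. This is a minimal illustration under stated assumptions, not the authors' implementation; names such as PriorEstimationNet, prior_adversarial_loss, and the choice of an MSE objective with gradient reversal are hypothetical.

```python
# Hedged sketch: an adversarial loss that discourages weather-specific
# information in detector features, using a physics-based prior as the target.
# All module and function names here are illustrative, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in the backward
    pass, so the backbone is trained to *suppress* weather cues while the
    prior-estimation head still learns to predict the prior."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class PriorEstimationNet(nn.Module):
    """Small head that regresses a single-channel weather prior from features."""
    def __init__(self, in_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, features):
        return self.net(features)


def prior_adversarial_loss(pen, features, prior_map, lam=1.0):
    """Predict the prior from gradient-reversed backbone features and compare
    it to the physics-derived prior resized to the feature resolution."""
    reversed_feats = GradientReversal.apply(features, lam)
    pred_prior = pen(reversed_feats)
    target = F.interpolate(prior_map, size=pred_prior.shape[-2:],
                           mode="bilinear", align_corners=False)
    return F.mse_loss(pred_prior, target)
```

In this reading, minimizing the loss with respect to the prior-estimation head while the reversed gradient flows into the backbone yields features from which the weather prior cannot be recovered, i.e., weather-invariant features.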