Detection-Driven Exposure-Correction Network for Nighttime Drone-View Object Detection

Publisher:
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Publication Type:
Journal Article
Citation:
IEEE Transactions on Geoscience and Remote Sensing, 2024, 62, pp. 1-14
Issue Date:
2024-01-01
Drone-view object detection (DroneDet) models typically suffer a significant performance drop when applied to nighttime scenes. Existing solutions employ an exposure-adjustment module to reveal objects hidden in dark regions before detection. However, most exposure-adjustment models are optimized only for human perception, so the exposure-adjusted images do not necessarily improve recognition. To tackle this issue, we propose a novel Detection-driven Exposure-correction network for nighttime DroneDet, called DEDet. DEDet conducts adaptive, nonlinear adjustment of pixel values in a spatially fine-grained manner to generate DroneDet-friendly images. Specifically, we develop a fine-grained parameter predictor (FPP) to estimate pixelwise parameter maps of the image filters. These filters, along with the estimated parameters, adjust the pixel values of the low-light image according to the nonuniform illumination of drone-captured images. To learn the nonlinear transformation from the original nighttime images to their DroneDet-friendly counterparts, we propose a progressive filtering module that applies recursive filters to iteratively refine the exposure-corrected image. Furthermore, to evaluate the performance of the proposed DEDet, we have built a dataset, NightDrone, which addresses the scarcity of datasets specifically tailored to this task. Extensive experiments conducted on four nighttime datasets show that DEDet achieves superior accuracy compared with state-of-the-art (SOTA) methods. In addition, ablation studies and visualizations demonstrate the validity and interpretability of our approach. Our NightDrone dataset can be downloaded from https://github.com/yuexiemail/NightDrone-Dataset.
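
For intuition only, the sketch below shows one plausible form of pixelwise, recursively applied exposure correction: a small network predicts a per-pixel parameter map and a quadratic illumination curve (in the spirit of Zero-DCE-style curve adjustment) is applied for a fixed number of iterations. The class name PixelwiseCurveAdjustment, the predictor architecture, the curve form, and the iteration count are illustrative assumptions, not the actual FPP or filters used in DEDet.

import torch
import torch.nn as nn

class PixelwiseCurveAdjustment(nn.Module):
    """Hypothetical sketch of detection-friendly exposure correction:
    predict a per-pixel parameter map and apply a simple quadratic
    illumination curve recursively to brighten a low-light image."""

    def __init__(self, iterations: int = 4):
        super().__init__()
        self.iterations = iterations
        # Tiny stand-in for a fine-grained parameter predictor:
        # outputs one parameter per pixel and channel, bounded in (-1, 1).
        self.predictor = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: low-light image in [0, 1], shape (B, 3, H, W)
        alpha = self.predictor(x)           # pixelwise parameter map
        out = x
        for _ in range(self.iterations):    # progressive (recursive) filtering
            # Quadratic curve pushes dark pixels up while keeping values
            # in [0, 1] as long as alpha stays within [-1, 1].
            out = out + alpha * (out - out * out)
        return out.clamp(0.0, 1.0)

if __name__ == "__main__":
    model = PixelwiseCurveAdjustment()
    night_image = torch.rand(1, 3, 256, 256) * 0.2   # synthetic dark input
    corrected = model(night_image)
    print(corrected.shape, corrected.min().item(), corrected.max().item())

In a detection-driven setting, such a module would be trained end to end with the downstream detector's loss rather than with an image-quality objective, so the corrected images are optimized for recognition rather than for human viewing.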