Abstract:
Unmanned aerial vehicle (UAV)-based visual systems suffer from poor perception at nighttime. Enlightening nighttime vision for UAVs poses three challenges. First, UAV nighttime images differ from ordinary underexposed images in their statistical characteristics, which limits the performance of general low-light image enhancement (LLIE) methods. Second, when nighttime images are enlightened, artifacts tend to be amplified, distracting the visual perception of UAVs. Third, because paired data are inherently scarce in the real world, UAV nighttime vision can hardly benefit from supervised learning. To meet these challenges, we propose a zero-referenced enlightening and restoration network (ZERNet) that improves the perception of UAV vision at nighttime. Specifically, ZERNet estimates a nighttime enlightening map (NE-map) and then conducts a pixel-to-pixel transformation that enlightens dark pixels while suppressing overbright ones. Furthermore, we propose self-regularized restoration to preserve semantic content and restrict artifacts in the final result. Finally, our method is built on zero-referenced learning and is thus free from paired training data. Comprehensive experiments show that the proposed ZERNet effectively improves the nighttime visual perception of UAVs in terms of quantitative metrics, qualitative comparisons, and application-based analysis.
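The abstract does not give the exact NE-map transform, so as an illustration only, the sketch below applies a pixel-wise enhancement map with a zero-referenced-style quadratic curve (in the spirit of methods such as Zero-DCE, not necessarily ZERNet's formulation): dark pixels are raised substantially while pixels already near full brightness change little, which matches the stated goal of enlightening dark pixels while suppressing overbright ones. The function name `enlighten` and the curve form are assumptions for illustration.

```python
import numpy as np

def enlighten(image, ne_map):
    """Hypothetical pixel-to-pixel enlightening transform.

    image, ne_map: float arrays in [0, 1] with the same shape.
    The quadratic term ne_map * image * (1 - image) boosts
    mid-to-dark pixels and vanishes as a pixel approaches 1,
    so overbright regions are not pushed past saturation.
    """
    return image + ne_map * image * (1.0 - image)

# Toy example: a dark, a mid, and a near-saturated pixel.
dark = np.array([0.05, 0.30, 0.90])
ne = np.full_like(dark, 0.8)  # assumed per-pixel map values
bright = enlighten(dark, ne)
```

Because the curve output `i + e*i*(1-i)` stays within [0, 1] for inputs and map values in [0, 1], the transform never clips, which is one reason this curve family is popular in zero-referenced enhancement.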
Source:
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS
ISSN: 1545-598X
Year: 2024
Volume: 21
Impact Factor: 4.000 (JCR@2023)
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0