Abstract:
Camouflaged object detection aims to detect highly concealed objects hidden in complex environments and has important application value in many fields, such as medicine and agriculture. Existing methods that incorporate boundary priors over-emphasize the boundary region and lack the ability to represent the internal information of camouflaged objects, so the internal regions of camouflaged objects are detected inaccurately. Meanwhile, existing methods do not effectively mine the foreground features of camouflaged objects, so background regions are mistakenly detected as camouflaged objects. To address these issues, this paper proposes a camouflaged object detection method based on boundary feature fusion and foreground guidance, which consists of feature extraction, boundary feature fusion, backbone feature enhancement, and prediction stages. In the boundary feature fusion stage, boundary features are first obtained through a boundary feature extraction module and a boundary mask is predicted. Then, a boundary feature fusion module fuses the boundary features and the boundary mask with the lowest-level backbone features, thereby enhancing both the boundary location and the internal-region features of the camouflaged object. In addition, a foreground guidance module is designed to enhance the backbone features using the predicted camouflaged object mask: the mask predicted from the previous layer's features serves as foreground attention for the current layer's features, and spatial interaction is performed on the features to strengthen the network's ability to recognize spatial relationships, enabling the network to focus on fine and complete camouflaged object regions.
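The foreground-guidance idea described above can be sketched roughly as follows. This is a minimal NumPy illustration under assumptions: the previous layer's predicted mask logits are passed through a sigmoid to form spatial foreground attention, which reweights the current layer's features with a residual connection. Function and variable names (`foreground_guidance`, `prev_mask_logits`) are hypothetical, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    # Numerically simple logistic function for illustration.
    return 1.0 / (1.0 + np.exp(-x))

def foreground_guidance(feat, prev_mask_logits):
    """Sketch of foreground-guided feature enhancement.

    feat: (C, H, W) current-layer backbone features.
    prev_mask_logits: (H, W) camouflaged-object mask logits predicted
    from the previous layer's features.
    """
    attn = sigmoid(prev_mask_logits)       # foreground attention in [0, 1]
    guided = feat * attn[None, :, :]       # emphasize predicted foreground
    return feat + guided                   # residual keeps original context
```

In this sketch, regions the previous layer confidently predicts as foreground (attention near 1) are roughly doubled in magnitude, while confident background regions pass through almost unchanged; the actual module in the paper additionally performs spatial interaction on the features, which is omitted here.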
Extensive experiments on four widely used benchmark datasets show that the proposed method outperforms the 19 mainstream methods compared and has stronger robustness and generalization ability for camouflaged object detection tasks. © 2024 Chinese Institute of Electronics. All rights reserved.
Source:
Acta Electronica Sinica
ISSN: 0372-2112
Year: 2024
Issue: 7
Volume: 52
Page: 2279-2290