Zero-Shot Camouflaged Object Detection.
- Publisher:
- IEEE (Institute of Electrical and Electronics Engineers)
- Publication Type:
- Journal Article
- Citation:
- IEEE Transactions on Image Processing, vol. 32, pp. 5126-5137, 2023
- Issue Date:
- 2023
Closed Access
Filename | Description | Size
---|---|---
Zero-Shot_Camouflaged_Object_Detection.pdf | Published version | 6.85 MB
This item is closed access and not available.
The goal of camouflaged object detection (COD) is to detect objects that are visually embedded in their surroundings. Existing COD methods focus only on detecting camouflaged objects from seen classes and suffer performance degradation when detecting unseen classes. In real-world scenarios, however, collecting sufficient data for every class is extremely difficult and labeling it demands highly specialized skills, which makes these COD methods impractical. In this paper, we propose a new zero-shot COD framework (termed ZSCOD), which can effectively detect objects of never-seen (unseen) classes. Specifically, our framework comprises a Dynamic Graph Searching Network (DGSNet) and a Camouflaged Visual Reasoning Generator (CVRG). In detail, DGSNet is proposed to adaptively capture more edge details to boost COD performance. CVRG is used to produce pseudo-features that are close to the real features of seen camouflaged objects, thereby transferring knowledge from seen classes to unseen classes to help detect unseen objects. In addition, our graph reasoning is built on a dynamic searching strategy, which attends more closely to object boundaries to reduce the influence of the background. More importantly, we construct the first zero-shot COD benchmark based on the COD10K dataset. Experimental results on public datasets show that our ZSCOD not only detects camouflaged objects of unseen classes but also achieves state-of-the-art performance on seen classes.
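The core transfer idea in the abstract, generating pseudo visual features for unseen classes from class-level semantic information learned on seen classes, can be illustrated with a minimal sketch. This is not the paper's CVRG; it is a toy stand-in (a least-squares map from semantic embeddings to visual features, with all data and dimensions invented for illustration) that shows why a pseudo-feature lets a detector match an unseen-class sample without any real training features for that class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only, not the paper's model): each class has a
# semantic embedding s; real visual features x exist for seen classes only.
d_sem, d_vis, n_seen = 4, 16, 6
seen_sem = rng.normal(size=(n_seen, d_sem))       # seen-class semantics
W_true = rng.normal(size=(d_sem, d_vis))          # hidden semantics->visual map
seen_vis = seen_sem @ W_true + 0.01 * rng.normal(size=(n_seen, d_vis))

# "Generator": least-squares map from semantic to visual space, fitted on
# seen classes only (a crude stand-in for a learned feature generator).
W, *_ = np.linalg.lstsq(seen_sem, seen_vis, rcond=None)

# Transfer: synthesize a pseudo-feature for an unseen class from its
# semantic embedding alone -- no real samples of that class are used.
unseen_sem = rng.normal(size=(1, d_sem))
pseudo = unseen_sem @ W

# A real sample of the unseen class should now match the pseudo-feature
# rather than any seen-class prototype.
sample = unseen_sem @ W_true
protos = np.vstack([seen_vis, pseudo])            # last row = unseen class
pred = int(np.argmin(np.linalg.norm(protos - sample, axis=1)))
print(pred)
```

Here the nearest prototype to the unseen-class sample is the generated pseudo-feature (index `n_seen`), which is the mechanism that lets knowledge learned on seen classes support recognition of classes never observed during training.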