Modality Coupling for Privacy Image Classification
- Publisher:
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- Publication Type:
- Journal Article
- Citation:
- IEEE Transactions on Information Forensics and Security, 2023, 18, pp. 4843-4853
- Issue Date:
- 2023-01-01
Closed Access
Filename | Description | Size
---|---|---
64. TIFS. 2023. Yonggang.pdf | Published version | 16.05 MB
This item is closed access and not available.
Privacy image classification (PIC) has attracted increasing attention as it can help people make appropriate privacy decisions when sharing images. Recently, pioneering research efforts have utilized multimodal information for PIC, since multiple modalities can provide richer information than a single modality. However, these multimodal PIC efforts rest on the assumption of independent and identical distribution, whereas connections between different modalities commonly exist in real-world cases. Taking the scene and object modalities as an example, in the scene 'library/indoor', the object 'book jacket' appears with high probability. To this end, this paper proposes a novel PIC approach, called CoupledPIC, to bridge this gap by comprehensively capturing the coupling relations between different modalities. In CoupledPIC, two submodules are designed to capture explicit and implicit coupling relations between different modalities, respectively. The explicit modality coupling is learned with a tensor-fusion-network-based submodule via direct feature interaction. For the implicit modality coupling, a graph-convolutional-network-based submodule is proposed that learns on both initial graphs and attention-guided graphs via information aggregation on graphs. Extensive experiments on the public benchmark, PicAlert, demonstrate the effectiveness of the proposed CoupledPIC, yielding significant improvements by modeling inter-modality coupling information.
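Since the paper itself is closed access here, the following is only a rough, minimal PyTorch sketch of the two kinds of coupling the abstract describes: explicit coupling via a tensor-fusion-style outer product of two modality embeddings, and implicit coupling via graph convolution over both an initial and an attention-guided adjacency. All module and variable names (ExplicitCoupling, ImplicitCoupling, the scene/object dimensions) are hypothetical illustrations, not the authors' implementation.

```python
# Illustrative sketch only (not the CoupledPIC code), assuming two modality
# feature vectors (e.g. scene and object) already produced by upstream encoders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExplicitCoupling(nn.Module):
    """Tensor-fusion-style explicit coupling: direct feature interaction
    via the outer product of two modality embeddings."""

    def __init__(self, dim_a: int, dim_b: int, out_dim: int):
        super().__init__()
        # Project the flattened (dim_a+1) x (dim_b+1) interaction tensor.
        self.proj = nn.Linear((dim_a + 1) * (dim_b + 1), out_dim)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Append a constant 1 so unimodal terms survive the outer product.
        ones = a.new_ones(a.size(0), 1)
        a1 = torch.cat([a, ones], dim=1)                      # (B, dim_a+1)
        b1 = torch.cat([b, ones], dim=1)                      # (B, dim_b+1)
        fused = torch.bmm(a1.unsqueeze(2), b1.unsqueeze(1))   # (B, dim_a+1, dim_b+1)
        return F.relu(self.proj(fused.flatten(1)))


class ImplicitCoupling(nn.Module):
    """GCN-style implicit coupling: aggregate node features over an initial
    adjacency and an attention-guided adjacency, then combine them."""

    def __init__(self, dim: int):
        super().__init__()
        self.gc_init = nn.Linear(dim, dim)
        self.gc_attn = nn.Linear(dim, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, x: torch.Tensor, adj_init: torch.Tensor) -> torch.Tensor:
        # x: (N, dim) node features (e.g. scene/object concept nodes);
        # adj_init: (N, N) initial adjacency, e.g. from co-occurrence statistics.
        n = x.size(0)
        # Build an attention-guided adjacency from pairwise node features.
        pair = torch.cat([x.unsqueeze(1).expand(n, n, -1),
                          x.unsqueeze(0).expand(n, n, -1)], dim=-1)
        adj_attn = torch.softmax(self.attn(pair).squeeze(-1), dim=-1)
        h_init = F.relu(self.gc_init(adj_init @ x))
        h_attn = F.relu(self.gc_attn(adj_attn @ x))
        return h_init + h_attn


# Hypothetical usage: fuse scene/object embeddings, then feed a classifier.
explicit = ExplicitCoupling(dim_a=128, dim_b=128, out_dim=256)
fused = explicit(torch.randn(4, 128), torch.randn(4, 128))    # (4, 256)
```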