Towards better graph representation: Two-branch collaborative graph neural networks for multimodal marketing intention detection
- Publisher:
- IEEE
- Publication Type:
- Conference Proceeding
- Citation:
- Proceedings of the IEEE International Conference on Multimedia and Expo (ICME), July 2020, pp. 1-6
- Issue Date:
- 2020-07-01
Closed Access
Filename | Description | Size
---|---|---
09102918.pdf | Published version | 522.98 kB
This item is closed access and not available.
© 2020 IEEE. As spreading and collecting information through the Internet becomes the norm, more and more people post for-profit content (images and texts) on social networks. Because such content is difficult to censor at scale, malicious marketing can harm society, so it is valuable to detect marketing intentions online automatically. However, the gap between modalities makes it difficult to fuse images and texts for content-marketing detection. To this end, this paper proposes Two-Branch Collaborative Graph Neural Networks, which collaboratively represent multimodal data with Graph Convolutional Networks (GCNs) in an end-to-end fashion. We first embed groups of images and texts separately with GCN layers from two views, then apply the proposed multimodal fusion strategy to learn the graph representation collaboratively. Experimental results demonstrate that the proposed method achieves superior graph-classification performance for marketing intention detection.
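The abstract describes two GCN branches (one per modality) whose representations are fused for graph-level classification. The following is a minimal NumPy sketch of that general pattern, not the authors' exact model: the layer sizes, mean-pooling readout, and concatenation-based fusion are assumptions for illustration only (the paper's closed-access text defines its own fusion strategy).

```python
import numpy as np

def normalize_adj(adj):
    """Symmetrically normalize A + I: D^-1/2 (A + I) D^-1/2 (standard GCN trick)."""
    a = adj + np.eye(adj.shape[0])          # add self-loops so degrees are >= 1
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
    return d_inv_sqrt @ a @ d_inv_sqrt

def gcn_layer(a_hat, h, w):
    """One graph convolution: ReLU(A_hat @ H @ W)."""
    return np.maximum(a_hat @ h @ w, 0.0)

def two_branch_forward(adj_img, x_img, adj_txt, x_txt, w_img, w_txt, w_cls):
    """Embed each modality with its own GCN branch, mean-pool each graph,
    fuse by concatenation (an assumed fusion choice), then classify."""
    h_img = gcn_layer(normalize_adj(adj_img), x_img, w_img).mean(axis=0)
    h_txt = gcn_layer(normalize_adj(adj_txt), x_txt, w_txt).mean(axis=0)
    fused = np.concatenate([h_img, h_txt])  # joint graph representation
    return fused @ w_cls                    # graph-level class logits

# Toy example: 3 image nodes, 4 text nodes, 5-dim features, 2 classes.
rng = np.random.default_rng(0)
adj_i = (rng.random((3, 3)) > 0.5).astype(float); adj_i = np.maximum(adj_i, adj_i.T)
adj_t = (rng.random((4, 4)) > 0.5).astype(float); adj_t = np.maximum(adj_t, adj_t.T)
x_i, x_t = rng.standard_normal((3, 5)), rng.standard_normal((4, 5))
w_i, w_t = rng.standard_normal((5, 8)), rng.standard_normal((5, 8))
w_c = rng.standard_normal((16, 2))
logits = two_branch_forward(adj_i, x_i, adj_t, x_t, w_i, w_t, w_c)
print(logits.shape)  # (2,)
```

In a trained model the weight matrices would be learned end-to-end; here they are random only to show the data flow from the two modality graphs to a single classification output.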