MIG-net: Multi-scale Network Alternatively Guided by Intensity and Gradient Features for Depth Map Super-resolution

Publisher:
Institute of Electrical and Electronics Engineers
Publication Type:
Journal Article
Citation:
IEEE Transactions on Multimedia, 2022, 24, pp. 3506-3519
Issue Date:
2022-01-01
Abstract:
Studies over the previous decades have shown that the quality of depth maps can be significantly improved by introducing guidance from intensity images describing the same scenes. With the rise of deep convolutional neural networks, the performance of guided depth map super-resolution has been further improved. Existing variants typically rely on deeper structures, optimized gradient flow, and feature reuse. Nevertheless, it is difficult to obtain sufficient and appropriate guidance from intensity features without any prior. In fact, features in the gradient domain, e.g., edges, exhibit strong correlations between the intensity image and the corresponding depth map, so guidance in the gradient domain can be exploited more efficiently. In this paper, the depth features are iteratively upsampled by 2×. In each upsampling stage, the low-quality depth features and the corresponding gradient features are iteratively refined under the guidance of the intensity features via two parallel streams. Then, to make full use of the depth information in both the pixel and gradient domains, the depth features and gradient features alternately complement each other. Extensive experimental results show improvements over state-of-the-art counterparts in both objective and subjective assessments.
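For illustration, the following is a minimal PyTorch sketch of one such 2× upsampling stage, assembled only from the description above; the module names, channel widths, and the concatenation-based fusion are assumptions made for this sketch and are not the authors' actual implementation.

# Minimal sketch of one 2x stage with parallel intensity-guided depth and
# gradient streams; all design details here are illustrative assumptions.
import torch
import torch.nn as nn

class GuidedRefineBlock(nn.Module):
    """Refines one feature stream (depth or gradient) using intensity guidance."""
    def __init__(self, channels):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, feat, guide):
        # Concatenate the stream with intensity features and refine residually.
        return feat + self.fuse(torch.cat([feat, guide], dim=1))

class UpsampleStage(nn.Module):
    """One 2x stage: two parallel intensity-guided streams whose depth and
    gradient features complement each other before upsampling."""
    def __init__(self, channels):
        super().__init__()
        self.depth_refine = GuidedRefineBlock(channels)
        self.grad_refine = GuidedRefineBlock(channels)
        self.depth_from_grad = nn.Conv2d(channels, channels, 3, padding=1)
        self.grad_from_depth = nn.Conv2d(channels, channels, 3, padding=1)
        # 2x upsampling via sub-pixel convolution (one possible choice).
        self.up = nn.Sequential(
            nn.Conv2d(channels, 4 * channels, 3, padding=1),
            nn.PixelShuffle(2),
        )

    def forward(self, depth_feat, grad_feat, intensity_feat):
        # Two parallel streams, each refined under intensity guidance.
        d = self.depth_refine(depth_feat, intensity_feat)
        g = self.grad_refine(grad_feat, intensity_feat)
        # Alternate complementation between pixel and gradient domains.
        d = d + self.depth_from_grad(g)
        g = g + self.grad_from_depth(d)
        return self.up(d), self.up(g)

Stacking several such stages would yield the iterative 2× upsampling pipeline described in the abstract, with the intensity features downsampled or re-extracted to match each stage's resolution.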