Depth Map Enhancement by Revisiting Multi-scale Intensity Guidance within Coarse-to-fine Stages

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Circuits and Systems for Video Technology, 2019, vol. PP, no. 99, pp. 1-1
Issue Date:
2019-01-01
Filename: 08945171.pdf (Published version, Adobe PDF, 9.55 MB)
Abstract:
Unlike most deep-convolutional-network methods for guided depth map enhancement, which focus on increasing network depth, this paper aims to improve the effectiveness of intensity guidance as the network grows deeper. The proposed network upsamples the low-resolution depth map from coarse to fine. Within each refinement stage operating on depth features at a given scale, the guidance features at the current scale and at all coarser scales are revisited through dense connections, so the multi-scale guidance is efficiently preserved as the features propagate. Furthermore, the network keeps the intensity features in the high-resolution domain and extracts the multi-scale guidance directly from them, which further improves the quality of the guidance. In addition, the shallow depth features, upsampled by a transposed convolution layer, are transferred directly to the final depth features used for reconstruction; we call this global residual learning in the feature domain. Similarly, global residual learning in the pixel domain learns the difference between the ground-truth depth and the coarsely upsampled depth map, while local residual learning maintains the low-frequency content within each refinement stage and progressively recovers the high frequencies. The proposed method is evaluated in both noise-free and noisy settings against 16 state-of-the-art methods, and the qualitative and quantitative results show improved performance.
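To make the architecture described in the abstract more concrete, the following is a minimal PyTorch sketch of the coarse-to-fine pipeline as it reads above. It is not the authors' implementation: the module names (RefineStage, CoarseToFineSketch), channel widths, kernel sizes, and the choice of bilinear/bicubic resampling are all illustrative assumptions.

```python
# Minimal sketch (assumed layer sizes, NOT the paper's code) of coarse-to-fine
# guided depth upsampling with densely revisited multi-scale intensity guidance,
# local residual learning, and global residuals in feature and pixel domains.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class RefineStage(nn.Module):
    """One refinement stage: fuse depth features with the densely connected
    multi-scale guidance, learn a local residual, then move up one scale."""

    def __init__(self, ch=64, num_guides=1):
        super().__init__()
        self.fuse = nn.Conv2d(ch + num_guides * ch, ch, 3, padding=1)
        self.res = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1),
        )
        # Transposed convolution doubles the spatial resolution of depth features.
        self.up = nn.ConvTranspose2d(ch, ch, 4, stride=2, padding=1)

    def forward(self, depth_feat, guide_feats):
        # Dense connection: revisit current-scale and all coarser-scale guidance.
        x = self.fuse(torch.cat([depth_feat] + guide_feats, dim=1))
        x = x + self.res(x)   # local residual learning within the stage
        return self.up(x)     # hand finer features to the next stage


class CoarseToFineSketch(nn.Module):
    def __init__(self, scale_factor=8, ch=64):
        super().__init__()
        self.scale = scale_factor
        num_stages = int(math.log2(scale_factor))
        self.depth_in = nn.Conv2d(1, ch, 3, padding=1)
        # Guidance features stay in the high-resolution intensity domain; one
        # conv per stage re-extracts them before they are fed to that stage.
        self.guide_in = nn.Conv2d(1, ch, 3, padding=1)
        self.guide_convs = nn.ModuleList(
            [nn.Conv2d(ch, ch, 3, padding=1) for _ in range(num_stages)]
        )
        self.stages = nn.ModuleList(
            [RefineStage(ch, num_guides=i + 1) for i in range(num_stages)]
        )
        # Global residual in the feature domain: shallow depth features upsampled
        # by one transposed convolution and added to the final deep features.
        self.global_up = nn.ConvTranspose2d(ch, ch, 2 * scale_factor,
                                            stride=scale_factor,
                                            padding=scale_factor // 2)
        self.recon = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, lr_depth, hr_intensity):
        shallow = self.depth_in(lr_depth)
        g = self.guide_in(hr_intensity)
        feat, guide_feats = shallow, []
        for conv, stage in zip(self.guide_convs, self.stages):
            g = conv(g)
            guide_feats.append(g)
            # Resize every stored guidance map to the current depth-feature scale.
            scaled = [F.interpolate(gf, size=feat.shape[-2:], mode='bilinear',
                                    align_corners=False) for gf in guide_feats]
            feat = stage(feat, scaled)
        feat = feat + self.global_up(shallow)      # global residual, feature domain
        coarse = F.interpolate(lr_depth, scale_factor=self.scale,
                               mode='bicubic', align_corners=False)
        return coarse + self.recon(feat)           # global residual, pixel domain
```

Under these assumptions, a forward pass for 8x upsampling would be net = CoarseToFineSketch(scale_factor=8); sr = net(lr_depth, hr_intensity), with lr_depth of shape (N, 1, H, W) and hr_intensity of shape (N, 1, 8H, 8W).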