Frequency-Dependent Depth Map Enhancement via Iterative Depth-Guided Affine Transformation and Intensity-Guided Refinement

Publisher:
IEEE (Institute of Electrical and Electronics Engineers)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Multimedia, vol. 23, pp. 772–783, 2021
Issue Date:
2021-01-01
Recently, deep convolutional neural networks have shown significant improvements in intensity-guided depth map enhancement. Most networks focus on either increasing network depth or easing feature propagation via residual learning and dense connections. However, mitigating the artifacts caused by differences between the distributions of the depth map and the corresponding color image, e.g., edge misalignment, has not yet been explicitly considered. In this paper, a novel depth-guided affine transformation is used to filter out unrelated intensity features; the filtered intensity features are then used to refine the depth features. Since the quality of the initial depth features is low, the depth-guided intensity feature filtering and the intensity-guided depth feature refinement are performed iteratively, progressively improving both tasks. To make full use of the iterations, all the refined depth features are densely connected and followed by a 1 × 1 convolution layer. In addition, to improve performance for large upsampling factors (e.g., 16×), the depth features are enhanced from coarse to fine. Each frequency-dependent refinement of the depth features employs the above iterative subnetwork together with residual learning. The proposed method is tested in both noise-free and noisy cases and compared against 16 state-of-the-art methods. The experimental results show improved performance in both qualitative and quantitative evaluations.