Handling Occlusion and Large Displacement Through Improved RGB-D Scene Flow Estimation

Publication Type:
Journal Article
IEEE Transactions on Circuits and Systems for Video Technology, 2016, 26 (7), pp. 1265 - 1278
Abstract:
The accuracy of scene flow estimation is limited by challenges such as occlusion and large displacement motion. When occlusion occurs, points inside the occluded regions lose their corresponding counterparts in the preceding and succeeding frames. Large displacement motion increases the complexity of motion modeling and computation. Moreover, occlusion and large displacement motion are closely related problems in scene flow estimation; for example, large displacement motion often produces large occluded regions in the scene. This paper proposes an improved dense scene flow method based on red-green-blue-depth (RGB-D) data. To handle occlusion, we model the occlusion status of each point in our problem formulation and jointly estimate the scene flow and the occluded regions. To deal with large displacement motion, we employ an over-parameterized scene flow representation that models both the rotation and translation components of the scene flow, since large displacement motion cannot be well approximated by translational motion alone. Furthermore, we employ a two-stage optimization procedure for this over-parameterized representation. In the first stage, we propose a new RGB-D PatchMatch method, applied mainly in the RGB-D image space, to reduce the computational complexity introduced by large displacement motion. In a quantitative evaluation on the Middlebury data set, our method outperforms other published methods. The improved performance is also comprehensively confirmed on real data acquired by a Kinect sensor.
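The over-parameterized representation mentioned in the abstract assigns each point a local rigid motion (a rotation plus a translation) rather than a single translation vector. The sketch below illustrates how such a per-point rigid model produces a 3D flow vector; the axis-angle parameterization via the Rodrigues formula and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rodrigues(omega):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula).

    Illustrative helper, not from the paper: omega's direction is the
    rotation axis, its norm the rotation angle in radians.
    """
    theta = np.linalg.norm(omega)
    if theta < 1e-12:
        return np.eye(3)  # near-zero rotation: identity
    k = omega / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # cross-product matrix of the axis
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def scene_flow(X, omega, t):
    """3D flow of point X under a local rigid motion: R(omega) @ X + t - X."""
    return rodrigues(omega) @ X + t - X
```

With a zero rotation the model reduces to the usual translational flow (`scene_flow(X, 0, t) == t`), which is why the over-parameterized form strictly generalizes translation-only scene flow.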