Sim2RealVS: A New Benchmark for Video Stabilization with a Strong Baseline

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, 00, pp. 5395-5404
Issue Date:
2023-02-06
Abstract:
Video stabilization is highly desirable when videos undergo severe jittering artifacts. The difficulty of obtaining sufficient training data obstructs the development of video stabilization. In this work, we address this issue by presenting a Sim2RealVS benchmark with more than 1,300 pairs of shaky and stable videos. Our benchmark is curated with an in-game simulator covering diverse scenes and various jittering effects. Moreover, we propose a simple yet strong baseline approach, named Motion Trajectory Smoothing Network (MTSNet), that fully exploits our Sim2RealVS data. Our MTSNet consists of three main steps: motion estimation, global trajectory smoothing, and frame warping. In motion estimation, we design a Motion Correction and Completion (MCC) module to rectify optical flow with low confidence, such as in textureless regions, thus providing more consistent motion estimates for the subsequent steps. Benefiting from our synthetic data, we can explicitly learn a Trajectory Smoothing Transformer (TST) with ground-truth supervision to smooth global trajectories. In training the TST, we propose two fully supervised losses, i.e., a motion magnitude similarity loss and a motion tendency similarity loss. After training, our TST is able to produce smooth motion trajectories for shaky input videos. Extensive qualitative and quantitative results demonstrate that our MTSNet achieves superior performance on both synthetic and real-world data.
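The three-step pipeline described above (motion estimation, global trajectory smoothing, frame warping) can be illustrated with a minimal sketch. The abstract does not give MTSNet's internals, so this is only an assumption-laden toy: per-frame motions are assumed to be 2-D translations, and a simple moving average stands in for the learned Trajectory Smoothing Transformer (TST).

```python
import numpy as np

def accumulate_trajectory(per_frame_motion):
    """Step 1 output -> global trajectory: cumulative sum of per-frame
    (dx, dy) motions (a stand-in for the paper's motion estimation step)."""
    return np.cumsum(per_frame_motion, axis=0)

def smooth_trajectory(trajectory, radius=5):
    """Step 2: smooth the global trajectory. A moving average over a
    (2*radius + 1)-frame window substitutes for the learned TST."""
    padded = np.pad(trajectory, ((radius, radius), (0, 0)), mode="edge")
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.stack(
        [np.convolve(padded[:, c], kernel, mode="valid")
         for c in range(trajectory.shape[1])],
        axis=1,
    )

def compensating_motion(trajectory, smoothed):
    """Step 3 input: warp each frame by the offset between the smoothed
    and the original (shaky) camera path."""
    return smoothed - trajectory
```

The key idea the sketch captures is that stabilization reduces frame-to-frame jitter in the camera path rather than removing motion entirely: the smoothed trajectory follows the original path's overall tendency while its per-frame differences shrink.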