3D Lidar-IMU Calibration Based on Upsampled Preintegrated Measurements for Motion Distortion Correction

Publication Type:
Conference Proceeding
Proceedings - IEEE International Conference on Robotics and Automation, 2018, pp. 2149 - 2155
© 2018 IEEE. In this paper, we present a probabilistic framework to recover the extrinsic calibration parameters of a lidar-IMU sensing system. Unlike global-shutter cameras, lidars do not capture single snapshots of the environment; instead, they collect a succession of 3D points, generally grouped into scans. When these points are assumed to be expressed in a common frame, rapid sensor motion becomes an issue, as it causes motion distortion within each scan. The fundamental idea of the proposed framework is to use preintegration over interpolated inertial measurements to characterise the motion distortion in each lidar scan. Moreover, by using a set of planes as a calibration target, the proposed method exploits lidar point-to-plane distances to jointly calibrate and localise the system through on-manifold optimisation. The calibration does not rely on a predefined target, as arbitrary planes are detected and modelled in the first lidar scan. Simulated and real data demonstrate the effectiveness of the proposed method.
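Two building blocks mentioned in the abstract can be sketched concretely: upsampling IMU measurements by interpolation onto a denser time grid (so each lidar point timestamp has an associated inertial sample), and the point-to-plane distance used as the calibration residual. The sketch below is a minimal illustration under simple assumptions (linear interpolation, a plane in Hessian normal form n·x + d = 0); the helper names are hypothetical and the paper's actual formulation may differ.

```python
import numpy as np

def upsample_imu(timestamps, measurements, target_times):
    """Linearly interpolate IMU samples (one row per timestamp, e.g. gyro
    or accel channels as columns) onto a denser grid of target times.
    Hypothetical helper; the paper's exact interpolation scheme may differ."""
    measurements = np.asarray(measurements, dtype=float)
    return np.column_stack([
        np.interp(target_times, timestamps, measurements[:, k])
        for k in range(measurements.shape[1])
    ])

def point_to_plane_distance(points, plane_normal, plane_offset):
    """Signed distance of 3D points to the plane n.x + d = 0.
    The normal is renormalised so distances are in metric units."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.asarray(points, dtype=float) @ n + plane_offset
```

In a calibration pipeline of this kind, the interpolated inertial samples would feed the preintegration that predicts the sensor pose at each point's timestamp, and the resulting point-to-plane distances would form the residuals minimised by the on-manifold optimiser.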