IN2LAMA: INertial lidar localisation and mapping

Publication Type:
Conference Proceeding
Citation:
Proceedings - IEEE International Conference on Robotics and Automation (ICRA), May 2019, pp. 6388-6394
Issue Date:
2019-05-01
File: in2lama_inertial_lidar_localisation_and_mapping (1).pdf (Accepted Manuscript, Adobe PDF, 5.27 MB)
© 2019 IEEE. In this paper, we introduce a probabilistic framework for INertial Lidar Localisation And MApping (IN2LAMA). Most of today's lidars are based on spinning mechanisms and therefore do not capture instantaneous snapshots of the environment: the sensor can move while scanning. Without a good estimate of this motion, the resulting point clouds may be distorted. The lidar mapping literature commonly assumes a constant-velocity motion model, an approximation that does not always hold. The key idea of the proposed framework is to exploit preintegrated measurements over upsampled inertial data to handle motion distortion without any explicit motion model. It tightly integrates inertial and lidar data in a batch on-manifold optimisation formulation. Using temporally precise, upsampled preintegrated measurements allows frame-to-frame association of planar and edge features. Moreover, features are re-computed whenever the state estimate changes, consolidating the interaction between front-end and back-end. We validate the effectiveness of the approach on both simulated and real data.
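To make the abstract's central notion concrete, the following is a minimal sketch of standard IMU preintegration: accumulating delta-rotation, delta-velocity, and delta-position from inertial samples between two lidar feature timestamps. This is a generic illustration of the technique, not the paper's implementation; bias terms, noise propagation, gravity compensation, and the on-manifold optimisation are all omitted, and the function names are hypothetical.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation matrix from a rotation vector w."""
    theta = np.linalg.norm(w)
    if theta < 1e-10:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate preintegrated deltas over a window of IMU samples.

    gyro, accel: (N, 3) arrays of angular rate [rad/s] and specific
    force [m/s^2] in the body frame; dt: sample period [s].
    Returns (dR, dv, dp): relative rotation, velocity and position
    deltas expressed in the frame at the start of the window.
    """
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, accel):
        # Integrate position and velocity with the current attitude,
        # then update the attitude with this step's rotation increment.
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

Because the deltas depend only on the raw IMU samples (and, in the full formulation, linearly on the bias estimates), they can be computed once per feature interval and reused across optimisation iterations, which is what makes motion-model-free undistortion tractable.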