A Tightly-Coupled Event-Inertial Odometry using Exponential Decay and Linear Preintegrated Measurements

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Conference Proceeding
Citation:
IEEE International Conference on Intelligent Robots and Systems, October 2022, pp. 9475-9482
Issue Date:
2022-01-01
In this paper, we introduce an event-based visual odometry and mapping framework that relies on decaying event-based corners. Event cameras, unlike conventional cameras, can provide sensor data during high-speed motions or in scenes with high dynamic range. Rather than providing intensity information at a global shutter rate, events are triggered asynchronously whenever the brightness at a pixel location changes. This novel sensing paradigm calls for unconventional ego-motion estimation techniques to address these new challenges. The key aspect of our framework is the use of a continuous representation of inertial measurements to characterise the system's motion, which accommodates the asynchronous nature of the event data while estimating a discrete state in an optimisation-based approach. The proposed method relies on corners extracted from events-only data and associates them with a spatio-temporal locality scheme based on exponential decay. Event tracks are then tightly coupled with temporally accurate preintegrated inertial measurements, allowing for the estimation of ego-motion and a sparse map. The proposed method is evaluated on the Event Camera Dataset and compared against the state of the art in event-based visual-inertial odometry.
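To make the exponential-decay association idea concrete, the following Python sketch shows one plausible way to link a newly detected event corner to an existing corner track using a spatial radius and an exponentially decaying temporal score. The class name, the decay constant `TAU`, the radius `RADIUS`, and the threshold `MIN_SCORE` are illustrative assumptions for this sketch, not the implementation described in the paper.

```python
import numpy as np

# Illustrative parameters (assumed values, not from the paper).
TAU = 0.03        # decay time constant in seconds
RADIUS = 4.0      # spatial association radius in pixels
MIN_SCORE = 0.1   # decayed score below which a track is treated as stale


class CornerTrack:
    """One track of event-based corners with an exponentially decaying score."""

    def __init__(self, x, y, t):
        self.x, self.y = x, y   # last corner position (pixels)
        self.t = t              # timestamp of the last associated corner (seconds)
        self.score = 1.0        # locality score, decays with elapsed time

    def decayed_score(self, t_now):
        # Exponential temporal decay: the longer a track goes without
        # a new corner, the less influence it has on association.
        return self.score * np.exp(-(t_now - self.t) / TAU)


def associate(tracks, cx, cy, ct):
    """Associate a new corner (cx, cy, ct) with the spatio-temporally
    closest live track, or start a new track if none qualifies."""
    best, best_score = None, MIN_SCORE
    for trk in tracks:
        if np.hypot(trk.x - cx, trk.y - cy) > RADIUS:
            continue                      # outside the spatial locality window
        s = trk.decayed_score(ct)
        if s > best_score:                # least-decayed (most recent) track wins
            best, best_score = trk, s
    if best is None:
        best = CornerTrack(cx, cy, ct)    # no suitable track: start a new one
        tracks.append(best)
    else:
        # Refresh the track with the new corner observation.
        best.score = best.decayed_score(ct) + 1.0
        best.x, best.y, best.t = cx, cy, ct
    return best
```

In this sketch the decayed score plays the role of the spatio-temporal locality measure: corners arriving shortly after a track's last observation reinforce it, while tracks that stop receiving corners fade below `MIN_SCORE` and are no longer used for association.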