New framework for simultaneous localization and mapping: Multi map SLAM

Publication Type:
Conference Proceeding
Citation:
Proceedings - IEEE International Conference on Robotics and Automation, 2008, pp. 1892 - 1897
Issue Date:
2008-09-18
The main contribution of this paper arises from the development of a new framework, inspired by the mechanics of human navigation, for solving the problem of Simultaneous Localization and Mapping (SLAM). The proposed framework has specific relevance to vision-based SLAM, in particular small-baseline stereo-vision-based SLAM, and addresses several key issues relevant to this sensor domain. Firstly, as observed in the authors' earlier work, this particular sensing device has a highly nonlinear observation model, resulting in inconsistent state estimates when standard recursive estimators such as the Extended Kalman Filter (EKF) or its Unscented variants are used. Secondly, vision-based approaches tend to suffer from high feature density, a narrow field of view, and the potential need to maintain large databases for vision-based data association. The proposed Multi Map SLAM solution addresses the filter-inconsistency issue by formulating the SLAM problem as a nonlinear batch optimization. Feature management is addressed through a two-tier map representation, with unique attributes assigned to each map: the Global Map (GM) is a compact global representation of the robot's environment, while the Local Map (LM) is used exclusively for low-level navigation between local points within the robot's navigation horizon. ©2008 IEEE.
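To illustrate the batch-optimization idea in the abstract — jointly re-estimating all poses and landmarks from the full measurement set, rather than folding measurements in one at a time as an EKF does — the following is a minimal, hypothetical 1-D sketch. It is not the paper's formulation (which operates on small-baseline stereo vision data); the function name, the gradient-descent solver, and the toy measurement models are illustrative assumptions.

```python
# Toy 1-D batch SLAM sketch (illustrative only, not the paper's method).
# A robot moves along a line; we jointly refine its poses and a single
# landmark position by minimising the total squared residual over ALL
# odometry and range measurements at once.

def batch_slam_1d(odometry, ranges, iters=50, step=0.25):
    """Gradient-descent refinement of the stacked state [x0..xn, l]."""
    # Initial guess: dead-reckoned poses, landmark placed from the
    # first range measurement.
    x = [0.0]
    for u in odometry:
        x.append(x[-1] + u)
    state = x + [x[0] + ranges[0]]          # landmark appended last
    for _ in range(iters):
        grad = [0.0] * len(state)
        # Odometry residuals: (x[i+1] - x[i]) - u_i
        for i, u in enumerate(odometry):
            r = (state[i + 1] - state[i]) - u
            grad[i + 1] += 2 * r
            grad[i] -= 2 * r
        # Range residuals: (l - x[i]) - z_i
        for i, z in enumerate(ranges):
            r = (state[-1] - state[i]) - z
            grad[-1] += 2 * r
            grad[i] -= 2 * r
        grad[0] = 0.0                       # anchor x0 (gauge freedom)
        state = [s - step * g for s, g in zip(state, grad)]
    return state
```

Because every measurement constrains the optimization simultaneously, the linearization point is refined globally at each iteration, which is the property the abstract appeals to when contrasting batch optimization with recursive filtering.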