LiLoc: Enabling Precise 3D Localization in Dynamic Indoor Environments using LiDARs

Publisher:
Association for Computing Machinery (ACM)
Publication Type:
Conference Proceeding
Citation:
ACM International Conference Proceeding Series, 2023, pp. 158-171
Issue Date:
2023-05-09
We present LiLoc, a system for precise 3D localization and tracking of mobile IoT devices (e.g., robots) in indoor environments using multi-perspective LiDAR sensing. The key differentiators in our work are: (a) unlike traditional localization approaches, our approach is robust to dynamically changing environmental conditions (e.g., varying crowd levels, object placement/layout changes); and (b) unlike prior work on visual and 3D SLAM, LiLoc does not depend on a pre-built static map of the environment, and instead utilizes dynamically updated point clouds captured from both infrastructure-mounted LiDARs and LiDARs mounted on individual mobile IoT devices. To achieve fine-grained, near real-time location tracking, it performs complex 3D 'global' registration between the two point clouds only intermittently, to obtain robust spot location estimates, and augments these with repeated, simpler 'local' registrations to update the trajectory of the IoT device continuously. We demonstrate that LiLoc can (a) support accurate location tracking, with location and pose estimation errors of ≤7.4 cm and ≤3.2°, respectively, 84% of the time, and with the median error increasing only marginally (by 8%) for correctly estimated trajectories when the ambient environment is dynamic, (b) achieve a 36% reduction in median location estimation error compared to an approach that uses only a quasi-static global point cloud, and (c) obtain spot location estimates with a latency of only 973 ms.
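
To make the intermittent 'global' / repeated 'local' registration idea concrete, the following Python sketch shows one way such a pipeline could be structured with the open-source Open3D library. This is not the LiLoc implementation: the use of FPFH features with RANSAC for the global step, point-to-point ICP for the local step, and all function names and parameter values (voxel sizes, distance thresholds) are illustrative assumptions.

```python
# A minimal sketch (not the LiLoc implementation) of intermittent 'global'
# registration against an infrastructure point cloud plus repeated 'local'
# registration between consecutive device scans. All parameters are assumptions.
import numpy as np
import open3d as o3d


def preprocess(pcd, voxel_size):
    """Downsample a point cloud and compute FPFH features for global matching."""
    down = pcd.voxel_down_sample(voxel_size)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel_size * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down,
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel_size * 5, max_nn=100))
    return down, fpfh


def global_spot_estimate(device_pcd, infra_pcd, voxel_size=0.25):
    """Expensive 'global' registration of a device scan against the dynamically
    updated infrastructure point cloud (RANSAC over FPFH correspondences)."""
    src, src_fpfh = preprocess(device_pcd, voxel_size)
    tgt, tgt_fpfh = preprocess(infra_pcd, voxel_size)
    dist = voxel_size * 1.5
    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, True, dist,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
         o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(dist)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    return result.transformation  # 4x4 pose of the device in the infrastructure frame


def local_trajectory_update(curr_scan, prev_scan, prev_pose, voxel_size=0.25):
    """Cheap 'local' registration (ICP) between consecutive device scans, chained
    onto the last known pose to keep the trajectory updated continuously."""
    icp = o3d.pipelines.registration.registration_icp(
        curr_scan, prev_scan, voxel_size * 0.6, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return prev_pose @ icp.transformation
```

In such a scheme, the pose returned by the occasional global step would replace the pose accumulated from chained local updates, bounding the drift that incremental scan-to-scan registration alone would accrue; how LiLoc schedules and fuses the two kinds of estimates is described in the paper itself.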