Low-latency vision-based fiducial detection and localisation for object tracking
- Publication Type:
- Conference Proceeding
- ISARC 2017 - Proceedings of the 34th International Symposium on Automation and Robotics in Construction, 2017, pp. 706 - 711
This item is open access.
Real-time vision systems are widely used in the construction and manufacturing industries. A significant proportion of the computational resources of such systems is spent on fiducial identification and localisation for motion tracking of moving targets. The requirement is to localise a pattern in an image captured by the vision system precisely, accurately, and within the minimum available computation time. This paper therefore presents a class of patterns and proposes an algorithm to fulfil this requirement. The patterns are designed as circular patches of concentric circles, which increases the probability of detection and reduces false detections. In the detection algorithm, the image captured by the vision system is first scaled down for computationally efficient processing. The scaled image is then filtered to retain only the colour components that make up the outer circular patches of the proposed pattern. A blob detection algorithm then identifies the inner circular patches, which are localised in the image using the colour information obtained. Finally, the localised pattern, together with the camera matrix and distortion coefficients of the vision system, is passed to a perspective-n-point solver to estimate the marker's position and orientation in the global coordinate system. The system shows a significant improvement in fiducial detection and identification performance and achieves the required latency of under ten milliseconds, making it suitable for infrastructure monitoring in applications that involve high-speed real-time vision systems.
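The first stages of the pipeline described in the abstract (downscaling, colour filtering, and blob-centroid localisation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the scale factor, the colour channel, and the threshold are all assumptions, and the synthetic test image simply stands in for a frame from the vision system.

```python
import numpy as np

def detect_fiducial_centre(image, scale=4, channel=2, thresh=200):
    """Hypothetical sketch of the detection pipeline:
    1. scale the frame down for cheap processing,
    2. keep only the marker's colour channel,
    3. take the blob centroid,
    4. map it back to full-resolution coordinates.
    Returns (x, y) in full-resolution pixels, or None if no blob is found."""
    small = image[::scale, ::scale]           # crude downscale by subsampling
    mask = small[:, :, channel] > thresh      # colour filter on one channel
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                           # no candidate marker pixels
    # centroid of the blob in the scaled image, mapped back up
    return (xs.mean() * scale, ys.mean() * scale)

# synthetic 480x640 BGR frame with a red circular patch centred at (320, 240)
img = np.zeros((480, 640, 3), dtype=np.uint8)
yy, xx = np.ogrid[:480, :640]
img[(xx - 320) ** 2 + (yy - 240) ** 2 < 30 ** 2] = (0, 0, 255)
print(detect_fiducial_centre(img))
```

The final pose-estimation step would then feed the localised 2D pattern points, the known 3D marker geometry, and the camera intrinsics into a perspective-n-point solver (e.g. OpenCV's `cv2.solvePnP`) to recover the marker's position and orientation; that step is omitted here since the paper's exact marker geometry is not given.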