Non-parametric consistency test for multiple-sensing-modality data fusion
- Publication Type: Conference Proceeding
- 2015 18th International Conference on Information Fusion, Fusion 2015, 2015, pp. 443-451
- Issue Date: 2015
This item is open access.
© 2015 IEEE. Fusing data from multiple sensing modalities, e.g., laser and radar, is a promising approach to achieving resilient perception in challenging environmental conditions. However, it may lead to catastrophic fusion in the presence of inconsistent data, i.e., when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed consistent if the model statistically improves as a result of its fusion. This approach avoids the need for the absolute spatial distance threshold parameters required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
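The idea in the abstract can be sketched with a toy Gaussian process model: a candidate range return is fused only if the model's log marginal likelihood does not degrade once the point is included. This is a minimal illustration, not the paper's exact formulation; the squared-exponential kernel, the hyperparameter values, and the per-point normalization of the likelihood below are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def log_marginal_likelihood(X, y, noise_var=1e-2, **kern):
    # Standard GP log marginal likelihood:
    #   -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2*pi),
    # computed via a Cholesky factorization of K = K_f + noise * I.
    K = rbf_kernel(X, X, **kern) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(X) * np.log(2 * np.pi))

def is_consistent(X, y, x_new, y_new, noise_var=1e-2, **kern):
    # Hypothetical acceptance rule: fuse the candidate only if the
    # per-point log marginal likelihood does not degrade. This replaces
    # an absolute spatial distance threshold with a relative,
    # model-based criterion.
    lml_before = log_marginal_likelihood(X, y, noise_var, **kern) / len(X)
    X2, y2 = np.append(X, x_new), np.append(y, y_new)
    lml_after = log_marginal_likelihood(X2, y2, noise_var, **kern) / len(X2)
    return lml_after >= lml_before
```

For example, given returns from one sensor along a smooth surface, a second sensor's return that agrees with the surface passes the test, while a return that conflicts sharply with it, even at a nearby location, is rejected.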