Learning Mobility Aid Assistance via Decoupled Observation Models
- Publication Type: Conference Proceeding
- 2018 15th International Conference on Control, Automation, Robotics and Vision, ICARCV 2018, 2018, pp. 1903 - 1910
- Issue Date: 2018
This item is currently unavailable due to the publisher's embargo, which expires on 30 Nov 2020.
© 2018 IEEE. This paper presents an active assistance framework for mobility systems, such as Power Mobility Devices (PMDs), with the distinctive goal of operating within a local moving window, as opposed to the common reliance upon persistent global environments and objectives. Demonstration data from able experts driving a simulated mobility aid in a representative indoor setting are used off-line to build behavioral models of navigation, conditioned separately on user joystick inputs and on-board sensor data. These models are built via Gaussian Processes for the joystick signals and a Deep Convolutional Neural Network for the sensor data, in this case a planar LIDAR. Their combined outputs form a continuous distribution of estimated traversal likelihood within the user's immediate space, allowing real-time stochastic optimal path planning to guide users to their intended local destination. Moreover, the computational efficiency of the decoupled models permits rapid replanning on the fly for a smooth assistive action. On-line and off-line evaluations substantiate the advantages of the framework in generalising intelligent navigational assistance, of particular relevance for users who experience difficulty in safe mobility.
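The idea of decoupled observation models fused into a traversal likelihood over a local window can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the RBF kernel, the grid size, the stand-in function for the CNN, and the elementwise-product fusion are all assumptions introduced here for clarity.

```python
import numpy as np

def rbf_kernel(a, b, length=0.5):
    """Squared-exponential kernel between point sets a (n,d) and b (m,d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_likelihood(demo_xy, query_xy, noise=1e-2):
    """GP regression of 'traversed' evidence (value 1 at demonstrated
    positions) evaluated on query points; higher = more expert-like.
    Stands in for the paper's joystick-based GP model."""
    K = rbf_kernel(demo_xy, demo_xy) + noise * np.eye(len(demo_xy))
    Ks = rbf_kernel(query_xy, demo_xy)
    y = np.ones(len(demo_xy))  # demonstrations mark traversed space
    mu = Ks @ np.linalg.solve(K, y)
    return np.clip(mu, 0.0, 1.0)

def sensor_likelihood(query_xy, obstacles, clearance=0.4):
    """Hypothetical stand-in for the CNN over LIDAR data: penalise
    cells close to detected obstacles."""
    d = np.sqrt(((query_xy[:, None, :] - obstacles[None, :, :]) ** 2).sum(-1))
    return np.clip(d.min(axis=1) / clearance, 0.0, 1.0)

# Local moving window: a coarse grid around the user's current pose.
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
grid = np.stack([xs.ravel(), ys.ravel()], axis=1)

demos = np.array([[0.0, 0.0], [0.2, 0.3], [0.5, 0.6]])  # expert trace
obstacles = np.array([[-0.8, -0.8]])                     # LIDAR return

# Fuse the decoupled models into one traversal-likelihood field.
traversal = gp_likelihood(demos, grid) * sensor_likelihood(grid, obstacles)
best = grid[np.argmax(traversal)]  # candidate local goal for the planner
```

Because each model is cheap to evaluate on its own, the fused field can be recomputed as the window moves, which mirrors the on-the-fly replanning the abstract describes.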