End-to-End Joint Intention Estimation for Shared Control Personal Mobility Navigation
- Publication Type: Conference Proceeding
- 16th IEEE International Conference on Control, Automation, Robotics and Vision, ICARCV 2020, 2020, pp. 631-635
This item is currently unavailable due to the publisher's embargo, which expires on 8 Jan 2023.
Advancements in technology point to a future where systems work collaboratively with humans while sharing the same workspace. Navigation is one crucial aspect of daily life where such collaborative technologies can offer major assistance. An ageing population makes an increase in personal mobility devices (PMDs) likely, while autonomous cars are already bringing intelligent vehicles to the road. In such scenarios, however, the expected assistance can only be provided if the device is aware of its user's intention, so that controls can be applied in a tightly collaborative manner; the device should also be robust to different environments, users and mobile platforms. This work proposes a user-driven navigation framework that complements end-to-end sensing-only solutions by estimating controls as a joint intention from vehicle states and user inputs. The solution is shown to improve over similar strategies that rely solely on exteroceptive data and omit inputs from the driving agent. Furthermore, the framework is shown to transfer its learning to different environments and mobility platforms using a small amount of training data. Experiments on data from the autonomous driving community (the Udacity dataset) and on data obtained in-house with an instrumented power wheelchair demonstrate the validity of the proposed approach.