Environment-adaptive interaction primitives through visual context for human–robot motor skill learning
- Publication Type: Journal Article
- Autonomous Robots, 2019, 43 (5), pp. 1225 - 1240
© 2018, The Author(s). In situations where robots must co-operate closely with human partners, combining consideration of the task with observation of the partner maintains robustness when partner behavior is erratic or ambiguous. This paper documents our approach to capturing human–robot interactive skills by combining demonstration data with additional environmental parameters derived automatically from observation of the task context, without the need for heuristic assignment, as an extension that overcomes shortcomings of the interaction primitives framework. These parameters reduce the partner-observation period required before suitable robot motion can begin, and they enable success in cases where partner observation alone is inadequate for planning actions suited to the task. Validation in a collaborative object-covering exercise with a humanoid robot demonstrates the robustness of our environment-adaptive interaction primitives when augmented with parameters drawn directly from visual data of the task scene.