Environment-adaptive interaction primitives through visual context for human–robot motor skill learning

Publication Type:
Journal Article
Citation:
Autonomous Robots, 2019, 43 (5), pp. 1225–1240
Issue Date:
2019-06-15
© 2018, The Author(s). In situations where robots must cooperate closely with human partners, considering the task alongside partner observation maintains robustness when partner behavior is erratic or ambiguous. This paper documents our approach to capturing human–robot interactive skills by combining demonstration data with additional environmental parameters derived automatically from observation of the task context, without heuristic assignment, as an extension that overcomes shortcomings of the interaction primitives framework. These parameters reduce the partner observation period required before suitable robot motion can commence, and enable success in cases where partner observation alone is inadequate for planning actions suited to the task. Validation in a collaborative object-covering exercise with a humanoid robot demonstrates the robustness of our environment-adaptive interaction primitives when augmented with parameters drawn directly from visual data of the task scene.
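The underlying mechanism in the interaction primitives framework is Gaussian conditioning: a joint distribution over human and robot trajectory parameters is learned from demonstrations, and the robot's parameters are inferred from a partial human observation. The abstract's extension augments that joint distribution with environment parameters from the task scene. A minimal sketch of this conditioning step, with all dimensions, variable names, and stand-in data assumed for illustration (not the authors' implementation):

```python
import numpy as np

# Illustrative dimensions: human trajectory weights, robot trajectory
# weights, and environment parameters extracted from the visual scene.
n_h, n_r, n_e = 4, 4, 2

# Stand-in for demonstration data: each row concatenates the human,
# robot, and environment parameters of one recorded interaction.
rng = np.random.default_rng(0)
demos = rng.standard_normal((30, n_h + n_r + n_e))

# Joint Gaussian over all parameters, learned from the demonstrations
# (small regularizer keeps the covariance invertible).
mu = demos.mean(axis=0)
Sigma = np.cov(demos, rowvar=False) + 1e-6 * np.eye(n_h + n_r + n_e)

# At run time we observe only part of the human's motion plus the
# environment parameters; the robot's weights are to be inferred.
obs_idx = np.r_[0:2, n_h + n_r:n_h + n_r + n_e]   # partial human + env
pred_idx = np.r_[n_h:n_h + n_r]                    # robot weights
obs = demos[0, obs_idx]                            # pretend observation

# Standard Gaussian conditioning:
#   mu_pred + Sigma_po @ Sigma_oo^{-1} @ (obs - mu_obs)
S_oo = Sigma[np.ix_(obs_idx, obs_idx)]
S_po = Sigma[np.ix_(pred_idx, obs_idx)]
robot_weights = mu[pred_idx] + S_po @ np.linalg.solve(S_oo, obs - mu[obs_idx])
print(robot_weights.shape)
```

Because the environment parameters enter the observed block directly, the conditional estimate of the robot's motion is informative even before much of the partner's trajectory has been seen, which is the effect the abstract describes.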