Symmetric evaluation of multimodal human-robot interaction with gaze and standard control

Publisher:
MDPI
Publication Type:
Journal Article
Citation:
Symmetry, 2018, 10(12)
Issue Date:
2018-12-01
Abstract:
© 2018 by the authors. Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination of the two in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first, but required more moves to complete; and the third was to move multiple pieces into a pre-defined arrangement. While gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations in spatial accuracy and target selection. The multimodal setup aimed to mitigate these weaknesses of the eye gaze tracker, creating a superior system without relying solely on the controller. The experiment showed that the multimodal setup improved performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
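The reported p-values correspond to within-subjects comparisons of task performance across modalities. As a minimal sketch of how such a comparison could be run, assuming per-participant completion times and a paired t-test (the data below and the choice of test are illustrative assumptions, not the authors' actual analysis):

    import numpy as np
    from scipy import stats

    # Hypothetical completion times (seconds) for the same 20 participants
    # under two modalities; the values are made up for illustration only.
    rng = np.random.default_rng(42)
    gaze_only = rng.normal(loc=60.0, scale=8.0, size=20)
    multimodal = gaze_only - rng.normal(loc=6.0, scale=3.0, size=20)

    # Paired t-test: each participant experienced both conditions.
    t_stat, p_value = stats.ttest_rel(gaze_only, multimodal)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

If the timing data were not approximately normal, a nonparametric paired test such as scipy.stats.wilcoxon would be a natural substitute.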