Human activity recognition for domestic robots

Publication Type:
Chapter
Citation:
2015, 105, pp. 395-408
Issue Date:
2015-01-01
Abstract:
© Springer International Publishing Switzerland 2015. The capabilities of domestic service robots could be further improved if the robot were equipped with the ability to recognize activities performed by humans in its sensory range. For example, in a simple scenario, a floor-cleaning robot can vacuum the kitchen floor after recognizing the human activity "cooking in the kitchen". Most complex human activities can be subdivided into simple activities, which can later be used to recognize the complex activities. An activity like "taking medication" can be subdivided into simple activities such as "opening pill container" and "drinking water". However, even recognizing simple activities is highly challenging, due to similarities between different activities and dissimilarities within the same activity when it is performed by different people with different body poses and orientations. Even a simple human activity like "drinking water" can be performed while the subject is sitting, standing, or walking. Therefore, building machine learning techniques to recognize human activities with such complexities is non-trivial. To address this issue, we propose a human activity recognition technique that uses 3D skeleton features produced by a depth camera. The algorithm incorporates importance weights for the 3D skeleton joints according to the activity being performed, which allows it to ignore confusing or irrelevant features while relying on informative ones. These weighted joint features are then ensembled to train Dynamic Bayesian Networks (DBNs), which are used to infer human activities based on likelihoods. The proposed activity recognition technique is evaluated on a publicly available dataset and on UTS experiments, achieving overall accuracies of 85% and 90%, respectively.
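The abstract describes weighting 3D skeleton joints by their importance for a given activity and then scoring activity hypotheses by likelihood under a Dynamic Bayesian Network. The following is a minimal sketch of that idea, not the authors' implementation: the joint names, importance weights, and the HMM-style forward pass (a simple instance of a DBN) are illustrative assumptions.

```python
# Sketch: activity-specific joint weighting + likelihood-based inference.
# All parameters below are hypothetical placeholders, not values from the chapter.
import numpy as np

JOINTS = ["head", "neck", "torso", "l_hand", "r_hand", "l_foot", "r_foot"]

# Hypothetical importance weights per activity: hand joints matter more for
# "drinking_water", foot joints more for "walking".
JOINT_WEIGHTS = {
    "drinking_water": np.array([0.5, 0.5, 0.5, 1.0, 1.0, 0.1, 0.1]),
    "walking":        np.array([0.3, 0.3, 0.5, 0.4, 0.4, 1.0, 1.0]),
}

def weighted_features(skeleton_seq, weights):
    """Scale each joint's 3D coordinates by its importance weight.

    skeleton_seq: array of shape (T, num_joints, 3) from a depth camera.
    Returns a flattened feature array of shape (T, num_joints * 3).
    """
    scaled = skeleton_seq * weights[None, :, None]
    return scaled.reshape(len(skeleton_seq), -1)

def forward_log_likelihood(obs, start, trans, means, var=1.0):
    """Log-likelihood of an observation sequence under a discrete-state
    model with Gaussian emissions (HMM forward pass in log space); this
    stands in for the DBN likelihood evaluation described in the abstract."""
    T, d = obs.shape
    diff = obs[:, None, :] - means[None, :, :]            # (T, states, d)
    log_em = -0.5 * np.sum(diff ** 2, axis=2) / var \
             - 0.5 * d * np.log(2 * np.pi * var)          # (T, states)
    alpha = np.log(start) + log_em[0]
    for t in range(1, T):
        alpha = log_em[t] + np.array([
            np.logaddexp.reduce(alpha + np.log(trans[:, j]))
            for j in range(len(start))
        ])
    return np.logaddexp.reduce(alpha)

def classify(skeleton_seq, models):
    """Return the activity whose weighted-feature model assigns the highest
    log-likelihood to the observed skeleton sequence."""
    scores = {}
    for activity, (weights, start, trans, means) in models.items():
        feats = weighted_features(skeleton_seq, weights)
        scores[activity] = forward_log_likelihood(feats, start, trans, means)
    return max(scores, key=scores.get), scores
```

In this sketch, each activity carries its own joint-weight vector and its own model parameters (`start`, `trans`, `means`), so inference reduces to evaluating the observed sequence under every activity model and picking the maximum-likelihood one, in the spirit of the likelihood-based DBN inference outlined above.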