Bimodal face and body gesture database for automatic analysis of human nonverbal affective behavior

Publication Type:
Conference Proceeding
Citation:
Proceedings - International Conference on Pattern Recognition, 2006, vol. 1, pp. 1148-1153
Issue Date:
2006-12-01
To develop and test robust affective multimodal systems, researchers need access to databases containing representative samples of human multimodal expressive behavior. Creating such databases requires a major effort in defining representative behaviors, choosing expressive modalities, and collecting and labeling large amounts of data. At present, public databases exist only for single expressive modalities, such as facial expression analysis. A number of gesture databases of static and dynamic hand postures and dynamic hand gestures also exist. However, no readily available database combines affective face and body information in a genuinely bimodal manner. Accordingly, in this paper we present a bimodal database, recorded simultaneously by two high-resolution cameras, for use in the automatic analysis of human nonverbal affective behavior. © 2006 IEEE.