Creating and annotating affect databases from face and body display: a contemporary survey

Publication Type:
Conference Proceeding
Proceedings of SMC 2006, 2006, pp. 2426 - 2433
Databases containing representative samples of human multi-modal expressive behavior are needed for the development of affect recognition systems. At present, however, publicly available databases exist mainly for single expressive modalities, such as facial expressions, static and dynamic hand postures, and dynamic hand gestures. Only recently has a first bimodal affect database, consisting of expressive face and upper-body display, been released. To foster the development of affect recognition systems, this paper presents a comprehensive survey of the current state of the art in affect database creation from face and body display and elicits the requirements of an ideal multi-modal affect database.