Continuous Response to Music using Discrete Emotion Faces

Publisher:
Queen Mary University of London
Publication Type:
Conference Proceeding
Citation:
Proceedings of the 9th International Symposium on Computer Music Modelling and Retrieval, 2012, pp. 3 - 19
Issue Date:
2012-01
Filename:
2012001225OK.pdf (Published version, Adobe PDF, 1.4 MB)
Abstract:
An interface in which simple graphic faces expressing emotions were arranged in a clock-like distribution was developed, with the aim of allowing participants to quickly and easily rate the emotion expressed by music, continuously over time. We tested the interface using six music excerpts, one targeting each of the six faces: 'Excited' (at 1 o'clock), 'Happy' (3), 'Calm' (5), 'Sad' (7), 'Scared' (9) and 'Angry' (11). Thirty participants rated the emotion expressed by these excerpts on our 'emotion-face-clock'. By showing how continuous category selections (votes) changed over time, we demonstrated that (1) more than one emotion face could be expressed by the music at the same time, and (2) the emotion face that best portrayed the emotion conveyed by the music could change over time, with the change attributable to changes in musical structure.
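The clock-like layout and the vote-over-time analysis described above can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the one-second window, and the sample vote data are assumptions introduced here for clarity.

```python
import math
from collections import Counter

# Assumed layout of the 'emotion-face-clock': six faces placed at
# alternate clock positions (1, 3, 5, 7, 9, 11 o'clock), as in the abstract.
FACE_POSITIONS = {
    "Excited": 1, "Happy": 3, "Calm": 5,
    "Sad": 7, "Scared": 9, "Angry": 11,
}

def face_xy(hour: int, radius: float = 1.0) -> tuple[float, float]:
    """Screen coordinates of a face at a clock hour (y grows upward,
    angle measured clockwise from 12 o'clock)."""
    theta = 2 * math.pi * hour / 12
    return radius * math.sin(theta), radius * math.cos(theta)

def tally_votes(votes, window: float = 1.0):
    """Count face selections per time window from (time_sec, face) samples.

    Illustrates how several faces can attract votes at the same moment and
    how the most-selected face can shift as the music unfolds.
    """
    bins = {}
    for t, face in votes:
        bins.setdefault(int(t // window), Counter())[face] += 1
    return bins

# Hypothetical example: votes drift from 'Calm' toward 'Sad' over two seconds.
samples = [(0.2, "Calm"), (0.4, "Calm"), (0.7, "Sad"),
           (1.1, "Sad"), (1.5, "Sad"), (1.8, "Calm")]
print({w: c.most_common(1)[0] for w, c in tally_votes(samples).items()})
```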