Measuring Human Emotion in Short Documents to Improve Social Robot and Agent Interactions
- Publication Type: Conference Proceeding
- ADCAIJ : Advances in Distributed Computing and Artificial Intelligence Journal, 2019, 11489 LNAI, pp. 29-41
- File: Skillicorn2019_Chapter_MeasuringHumanEmotionInShortDo.pdf (Published version, 582.89 kB)
Copyright clearance is in progress: this item is new to OPUS and is not currently available.
© 2019, Springer Nature Switzerland AG. Social robots and agents can interact with people better if they can infer their affective state (emotions). While they cannot yet recognise affective state from tone and body language, they can use the fragments of speech that they (over)hear. We show that emotions, as conventionally framed, are difficult to detect. We suggest, from empirical results, that this is because emotions are the wrong granularity: emotions contain subemotions that are much more clearly separated from one another, and so are easier both to detect and to exploit.