Towards a minimal representation of affective gestures

Details

Serval ID
serval:BIB_A8110937502C
Type
Article: article from journal or magazine.
Collection
Publications
Title
Towards a minimal representation of affective gestures
Journal
IEEE Transactions on Affective Computing
Author(s)
Glowinski D., Dael N., Camurri A., Volpe G., Mortillaro M., Scherer K. R.
Publication state
Published
Issued date
03/2011
Peer-reviewed
Yes
Volume
2
Number
2
Pages
106-118
Language
English
Abstract
This paper presents a framework for the analysis of affective behavior starting from a reduced amount of visual information related to human upper-body movements. The main goal is to identify a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva Multimodal Emotion Portrayals) corpus was used to validate this framework. The selected data set consists of 12 emotions portrayed by 10 actors. The trajectories of the head and hands were visually tracked from a frontal and a lateral view. Postural/shape and dynamic expressive gesture features were identified and analyzed. A feature reduction procedure was carried out, resulting in a 4D model of emotion expression that effectively grouped emotions according to their valence (positive, negative) and arousal (high, low). These results show that emotionally relevant information can be extracted from the dynamic qualities of gesture. The framework has been implemented as software modules (plug-ins) extending the EyesWeb XMI Expressive Gesture Processing Library and will be used in user-centric, networked media applications, including future mobile devices, which are characterized by low computational resources and limited sensor systems.
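To illustrate the kind of dynamic gesture feature the abstract refers to, the following is a minimal sketch of motion energy computed from a tracked 2D hand trajectory, a quantity commonly associated with arousal in expressive gesture analysis. This is not the authors' EyesWeb implementation; the trajectories, frame rate, and function name are hypothetical.

```python
# Sketch: motion energy (mean squared speed) of a tracked 2D hand
# trajectory. Illustrative only, not the authors' EyesWeb plug-in.
# Coordinates and frame rate below are synthetic/hypothetical.
import math

def motion_energy(trajectory, fps=25.0):
    """Mean squared speed over a sequence of (x, y) positions."""
    if len(trajectory) < 2:
        return 0.0
    dt = 1.0 / fps
    squared_speeds = []
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        squared_speeds.append(speed * speed)
    return sum(squared_speeds) / len(squared_speeds)

# A slow, calm gesture vs. an agitated one (synthetic coordinates).
slow = [(0.00, 0.0), (0.01, 0.0), (0.02, 0.0), (0.03, 0.0)]
fast = [(0.00, 0.0), (0.20, 0.1), (0.00, 0.3), (0.30, 0.0)]

print(motion_energy(slow) < motion_energy(fast))  # higher energy ~ higher arousal
```

In a valence/arousal framing such as the one described above, features like this one would separate high-arousal portrayals (e.g., elation, hot anger) from low-arousal ones (e.g., sadness, relief).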
Keywords
Emotion, expressive gesture, automatic feature extraction
Create date
30/08/2011 16:48
Last modification date
20/08/2019 15:12