Towards a minimal representation of affective gestures

Details

ID Serval
serval:BIB_A8110937502C
Type
Article: journal or magazine article.
Collection
Publications
Title
Towards a minimal representation of affective gestures
Journal
IEEE Transactions on Affective Computing
Author(s)
Glowinski D., Dael N., Camurri A., Volpe G., Mortillaro M., Scherer K. R.
Editorial status
Published
Publication date
03/2011
Peer-reviewed
Yes
Volume
2
Issue
2
Pages
106-118
Language
English
Abstract
This paper presents a framework for the analysis of affective behavior, starting from a reduced amount of visual information related to human upper-body movements. The main goal is to identify a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva Multimodal Emotion Portrayals) corpus was used to validate this framework. Twelve emotions expressed by 10 actors form the selected data set of emotion portrayals. Visual tracking of head and hand trajectories was performed from a frontal and a lateral view. Postural/shape and dynamic expressive gesture features were identified and analyzed. A feature-reduction procedure was carried out, resulting in a 4D model of emotion expression that effectively grouped emotions according to their valence (positive, negative) and arousal (high, low). These results show that emotionally relevant information can be extracted from the dynamic qualities of gesture. The framework was implemented as software modules (plug-ins) extending the EyesWeb XMI Expressive Gesture Processing Library and will be used in user-centric, networked media applications, including future mobile devices characterized by low computational resources and limited sensor systems.
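The feature-reduction step described in the abstract (many gesture features compressed to a 4D representation) can be sketched with a standard PCA projection. This is a hypothetical illustration, not the authors' code: the data set shape (12 emotions × 10 actors) comes from the abstract, but the feature matrix here is synthetic and the choice of PCA is an assumption about the kind of reduction used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the real data: 120 portrayals
# (12 emotions x 10 actors) by 25 hypothetical postural/dynamic
# gesture features extracted from head and hand trajectories.
X = rng.normal(size=(120, 25))

def reduce_to_4d(X):
    """Project feature vectors onto their first 4 principal components."""
    Xc = X - X.mean(axis=0)                        # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:4].T                           # scores on top-4 components

Z = reduce_to_4d(X)
print(Z.shape)  # (120, 4): one compact 4D descriptor per portrayal
```

In the paper's setting, the four resulting dimensions would then be inspected for how well they separate portrayals by valence and arousal; here they are simply the directions of greatest variance in the synthetic features.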
Keywords
Emotion, expressive gesture, automatic feature extraction
Record created
30/08/2011 16:48
Record last modified
20/08/2019 15:12
Usage data