Emergent leaders through looking and speaking: From audio-visual data to multimodal recognition

Details

Serval ID
serval:BIB_B1AF11D5188F
Type
Article: article from journal or magazine.
Collection
Publications
Title
Emergent leaders through looking and speaking: From audio-visual data to multimodal recognition
Journal
Journal on Multimodal User Interfaces
Author(s)
Sanchez-Cortes D., Aran O., Jayagopi D.B., Schmid Mast M., Gatica-Perez D.
ISSN
1783-7677
Publication state
Published
Issued date
03/2013
Peer-reviewed
Yes
Volume
7
Number
1/2
Pages
39-53
Language
English
Abstract
In this paper we present a multimodal analysis of emergent leadership in small groups using audio-visual features and discuss our experience in designing and collecting a data corpus for this purpose. The ELEA Audio-Visual Synchronized corpus (ELEA AVS) was collected using a light portable setup and contains recordings of small group meetings. The participants in each group performed the winter survival task and filled in questionnaires related to personality and several social concepts such as leadership and dominance. In addition, the corpus includes annotations of participants' performance in the survival task, as well as annotations of social concepts from external viewers. Based on this corpus, we demonstrate the feasibility of predicting the emergent leader in small groups using automatically extracted audio and visual features based on speaking turns and visual attention, focusing specifically on multimodal features that use measures of being looked at while speaking and while not speaking. Our findings indicate that emergent leadership is related, but not equivalent, to dominance, and that while multimodal features bring a moderate degree of effectiveness in inferring the leader, much simpler features extracted from the audio channel give better performance.
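As a concrete illustration of the "looked at while speaking" and "looked at while not speaking" measures mentioned in the abstract, the following Python sketch is a minimal, hypothetical example (not the authors' implementation), assuming frame-level speaking-status and gaze-target annotations; the data format and function name are assumptions for illustration only.

```python
# Hypothetical sketch: tally how often each participant is looked at by
# others while speaking vs. while not speaking, given frame-level
# speaking-status and gaze-target annotations (assumed format).

from collections import defaultdict


def looked_at_measures(speaking, gaze):
    """
    speaking: dict mapping participant id -> list of 0/1 flags per frame
              (1 = participant is speaking in that frame).
    gaze:     dict mapping participant id -> list of gaze targets per frame
              (another participant id, or None when not looking at anyone).
    Returns two dicts: frames each participant is looked at while speaking,
    and frames looked at while not speaking.
    """
    looked_while_speaking = defaultdict(int)
    looked_while_silent = defaultdict(int)
    n_frames = len(next(iter(speaking.values())))

    for t in range(n_frames):
        for viewer, targets in gaze.items():
            target = targets[t]
            if target is None or target == viewer:
                continue  # viewer is not looking at another participant
            if speaking[target][t]:
                looked_while_speaking[target] += 1
            else:
                looked_while_silent[target] += 1

    return dict(looked_while_speaking), dict(looked_while_silent)


# Toy usage: two participants, four frames.
if __name__ == "__main__":
    speaking = {"A": [1, 1, 0, 0], "B": [0, 0, 1, 1]}
    gaze = {"A": [None, "B", "B", "B"], "B": ["A", "A", None, None]}
    print(looked_at_measures(speaking, gaze))
```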
Keywords
Emergent leadership, Nonverbal behavior, Multimodal cues, Small group interactions
Create date
19/11/2014 15:12
Last modification date
20/08/2019 16:20