Idiap Research Institute
Personality Trait Classification via Co-Occurrent Multiparty Multimodal Event Discovery
Type of publication: Conference paper
Citation: Okada_ICMI_2015
Publication status: Published
Booktitle: Proceedings of the ACM International Conference on Multimodal Interaction
Series: ICMI '15
Year: 2015
Month: November
Pages: 15-22
Publisher: ACM
Location: Seattle, Washington, USA
ISBN: 978-1-4503-3912-4
DOI: 10.1145/2818346.2820757
Abstract: This paper proposes a novel feature extraction framework for inferring personality traits and emergent leadership from multi-party multimodal conversation. The proposed framework represents multimodal features as the combination of each participant's nonverbal activity and the group's activity. This feature representation enables comparison of the nonverbal patterns extracted from participants of different groups in a metric space. It captures how the target member produces nonverbal behavior observed in a group (e.g., the member speaks while all members move their bodies), and is applicable to any kind of multiparty conversation task. Frequently co-occurring events are discovered from the multimodal sequences using graph clustering. The proposed framework is applied to the ELEA corpus, an audio-visual dataset collected from group meetings. We evaluate the framework on the binary classification task of 10 personality traits. Experimental results show that the model trained with co-occurrence features obtained higher accuracy than previous related work in 8 out of 10 traits. In addition, the co-occurrence features improve accuracy by 2% up to 17%.
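The co-occurrent event discovery step described in the abstract can be loosely sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the event names, the frame-wise binary encoding of nonverbal activity, and the thresholded connected-components grouping are all stand-ins for the graph clustering the paper actually uses.

```python
# Hypothetical sketch: measure how often pairs of binary nonverbal event
# streams (one 0/1 value per video frame) are active together, connect
# event pairs whose co-occurrence rate exceeds a threshold, and group
# frequently co-occurring events via connected components of that graph.
from itertools import combinations


def cooccurrence_features(streams, threshold=0.3):
    """streams: dict mapping event name -> equal-length list of 0/1 per frame."""
    names = sorted(streams)
    n_frames = len(streams[names[0]])

    # Fraction of frames in which both events of a pair are active.
    cooc = {}
    for a, b in combinations(names, 2):
        both = sum(x & y for x, y in zip(streams[a], streams[b]))
        cooc[(a, b)] = both / n_frames

    # Build an undirected graph: edge when the co-occurrence rate is high.
    adj = {n: set() for n in names}
    for (a, b), rate in cooc.items():
        if rate >= threshold:
            adj[a].add(b)
            adj[b].add(a)

    # Connected components approximate clusters of co-occurring events.
    seen, clusters = set(), []
    for n in names:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        clusters.append(sorted(comp))
    return cooc, clusters
```

For example, streams where one participant's speaking overlaps another's body movement would be grouped into one cluster, and the co-occurrence rates themselves could then serve as the feature vector for trait classification.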
Projects Idiap
Authors Okada, Shogo
Aran, Oya
Gatica-Perez, Daniel
  • Okada_ICMI_2015.pdf