Peer-reviewed Article

Analysis and Predictive Modeling of Body Language Behavior in Dyadic Interactions from Multimodal Interlocutor Cues

2014; Institute of Electrical and Electronics Engineers; Language: English

DOI

10.1109/tmm.2014.2328311

ISSN

1941-0077

Authors

Zhaojun Yang, Angeliki Metallinou, Shrikanth Narayanan

Topic(s)

Speech Recognition and Synthesis

Abstract

During dyadic interactions, participants adjust their behavior and give feedback continuously in response to the behavior of their interlocutors and the interaction context. In this paper, we study how a participant in a dyadic interaction adapts his/her body language to the behavior of the interlocutor, given the interaction goals and context. We apply a variety of psychology-inspired body language features to describe body motion and posture. We first examine the coordination between the dyad's behavior for two interaction stances: friendly and conflictive. The analysis empirically reveals the dyad's behavior coordination and helps identify informative interlocutor features with respect to the participant's target body language features. The coordination patterns between the dyad's behavior are found to depend on the interaction stance assumed. We apply a Gaussian mixture model (GMM)-based statistical mapping in combination with a Fisher kernel framework to automatically predict the body language of an interacting participant from the speech and gesture behavior of an interlocutor. The experimental results show that the Fisher kernel-based approach outperforms methods using only the GMM-based mapping or support vector regression, in terms of correlation coefficient and RMSE. These results suggest a significant level of predictability of body language behavior from interlocutor cues.
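For readers interested in the mechanics, the mapping stage described in the abstract corresponds to conditional prediction under a joint GMM over interlocutor and participant feature frames. The sketch below is a minimal, generic illustration of such a GMM-based mapping using scikit-learn; it is not the authors' implementation, and the feature dimensions, component count, and stand-in data are assumptions for illustration only. The Fisher kernel stage reported in the paper would additionally derive gradient-based (Fisher vector) features from the same GMM and feed them to a kernel regressor, which is omitted here.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture


def fit_joint_gmm(X, Y, n_components=8, seed=0):
    """Fit a joint GMM over stacked [interlocutor, participant] feature frames."""
    Z = np.hstack([X, Y])  # shape: (n_frames, dx + dy)
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full", random_state=seed)
    gmm.fit(Z)
    return gmm


def gmm_conditional_predict(gmm, X, dx):
    """Predict participant features as E[y | x] under the joint GMM.

    dx is the dimensionality of the interlocutor (input) features x.
    """
    weights, means, covs = gmm.weights_, gmm.means_, gmm.covariances_
    mu_x, mu_y = means[:, :dx], means[:, dx:]
    S_xx, S_yx = covs[:, :dx, :dx], covs[:, dx:, :dx]
    K, dy = len(weights), means.shape[1] - dx
    preds = np.zeros((X.shape[0], dy))
    for i, x in enumerate(X):
        # Posterior responsibilities p(k | x) from the marginal GMM over x.
        log_p = np.array([np.log(weights[k])
                          + multivariate_normal.logpdf(x, mu_x[k], S_xx[k])
                          for k in range(K)])
        log_p -= log_p.max()
        post = np.exp(log_p)
        post /= post.sum()
        # Weighted mixture of per-component conditional means of y given x.
        y_hat = np.zeros(dy)
        for k in range(K):
            cond_mean = mu_y[k] + S_yx[k] @ np.linalg.solve(S_xx[k], x - mu_x[k])
            y_hat += post[k] * cond_mean
        preds[i] = y_hat
    return preds


# Hypothetical usage with random stand-in features (dx interlocutor dims,
# dy participant body-language dims); real inputs would be the paper's
# psychology-inspired motion and posture features.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dx, dy = 6, 4
    X_train, Y_train = rng.normal(size=(500, dx)), rng.normal(size=(500, dy))
    gmm = fit_joint_gmm(X_train, Y_train, n_components=4)
    Y_pred = gmm_conditional_predict(gmm, rng.normal(size=(10, dx)), dx)
    print(Y_pred.shape)  # -> (10, 4)
```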
